Intel Pentium G3220 Processor Review

Intel Pentium G3220 3.0GHz Dual Core Processor

The Intel Pentium name has been around for a long time; it used to be synonymous with the latest and greatest processors. These days, though, the latest and greatest Intel processors are part of the Intel Core series, while the Pentium processors sit toward the lower end of the product stack and cost relatively little by comparison. When it comes to these lower-end Intel processors, I've had a question in my mind for some time now and finally decided to satisfy my curiosity: how well will a low-end processor run a gaming system these days? To answer that question I ordered the Pentium G3220 for a whole $69.99; it wasn't until I had the G3220 in my hand that I realized I could have saved a couple of bucks and paid $67.63 elsewhere. Where I ordered the Pentium G3220 from is a moot point. What does matter is that the G3220 costs less than 22% as much as the Intel Core i7-4770K, the top LGA1150 processor!
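As a quick sanity check on that price comparison, here is a sketch using the $69.99 street price quoted above and the $350 boxed price from the spec table below:

```python
# Rough price-ratio check for the "less than 22%" claim.
# $69.99 is the street price paid in the article; $350.00 is the
# boxed recommended customer price for the i7-4770K.
g3220_price = 69.99
i7_4770k_price = 350.00

ratio = g3220_price / i7_4770k_price
print(f"The G3220 costs about {ratio:.0%} of an i7-4770K")  # about 20%
```

Even against the cheaper $339 tray price, the ratio stays under 21%, so the claim holds either way.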

Intel Pentium G3220 Dual Core Processor Review

The Intel Pentium G3220 is a pretty basic processor by today's standards. The G3220 doesn't have all of the bells and whistles of the top-tier processors, but for under $70.00 you really can't expect it to have everything under the sun. The Pentium G3220 is a dual-core processor that cruises along at 3.0GHz and fits only the latest Intel LGA 1150 socket. It's not exactly a rocket ship, but at a minimum it will make your computer go. With a little bit of luck the Pentium G3220 will give us a great end-user experience and let us play all the latest games. The main thing to keep in mind while looking at the G3220 is that it isn't meant to be the fastest processor out there; it's meant for those building on a budget.

Before we get too far into the Pentium G3220's performance, let's take a look at the specifications of the G3220 alongside those of the Intel Core i7-4770K.

  Intel Pentium G3220 Intel Core i7-4770K
Number of Cores 2 4
Number of Threads 2 8
Clock Speed 3.0GHz 3.5GHz
Max Turbo Frequency N/A 3.9GHz
Intel Smart Cache 3MB 8MB
DMI2 5 GT/s 5 GT/s
# of QPI Links 0 0
Instruction Set 64-bit 64-bit
Instruction Set Extensions SSE4.1/4.2 SSE4.1/4.2, AVX 2.0
Lithography 22 nm 22 nm
Max TDP 53W 84W
Thermal Solution Specification PCG 2013C PCG 2013D
Recommended Customer Price Box: $64 / Tray: $64 Box: $350 / Tray: $339
Maximum Memory 32GB 32GB
Memory Types DDR3 1333 DDR3 1333/1600
Memory Channels 2 2
Max Memory Bandwidth 21.3 GB/s 25.6 GB/s
ECC Memory Supported Yes No
Processor Graphics Intel HD Graphics Intel HD Graphics 4600
Graphics Base Frequency 350 MHz 350 MHz
Graphics Max Dynamic Frequency 1.1GHz 1.25GHz
Intel Quick Sync Yes Yes
Intel Clear Video HD Technology No Yes
Intel InTru 3D Technology No Yes
Intel Insider No Yes
Intel Wireless Display No Yes
Number of Displays Supported 3 3
PCI Express Revision 3.0 3.0
PCI Express Configuration 1×16, 2×8, 1×8/2×4 1×16, 2×8, 1×8/2×4
Max # of PCI Express Lanes 16 16
Max CPU Configuration 1 1
T Case 72 Degrees Celsius 72.72 Degrees Celsius
Package Size 37.5mm x 37.5mm 37.5mm x 37.5mm
Graphics and IMC Lithography 22nm 22nm
Sockets Supported FCLGA1150 FCLGA1150
Intel Turbo Boost Technology No 2.0
Intel vPro Technology No No
Intel Hyper-Threading Technology No Yes
Intel Virtualization Technology (VT-x) Yes Yes
Intel Virtualization Technology for Directed I/O (VT-d) No No
Intel VT-x with Extended Page Tables (EPT) Yes Yes
Intel® TSX-NI No No
Intel® 64 Yes Yes
Intel My WiFi Technology No Yes
Idle States Yes Yes
Enhanced Intel SpeedStep Technology Yes Yes
Thermal Monitoring Technologies Yes Yes
Intel® Stable Image Platform Program (SIPP) No N/A
Intel Identity Protection Technology N/A Yes
AES New Instructions No Yes
Trusted Execution Technology No No
Execute Disable Bit Yes Yes

As you can see above, there are a number of differences between the Intel Pentium G3220 and the Core i7-4770K. Though, as I mentioned above, at less than 22% of the cost of the 4770K we can't expect both pieces of silicon to offer the same bells and whistles!
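One spec worth decoding: the two Max Memory Bandwidth figures in the table follow directly from the supported memory types. Peak theoretical DDR3 bandwidth is the transfer rate times 8 bytes per 64-bit transfer times the number of channels; a quick sketch:

```python
# Peak theoretical DDR3 bandwidth: transfer rate (MT/s) x 8 bytes per
# transfer (64-bit channel) x number of channels. This reproduces the
# "Max Memory Bandwidth" rows in the spec table.
def ddr3_bandwidth_gbs(mega_transfers_per_sec: int, channels: int = 2) -> float:
    return mega_transfers_per_sec * 8 * channels / 1000  # GB/s

print(ddr3_bandwidth_gbs(1333))  # G3220, DDR3-1333 dual channel: ~21.3 GB/s
print(ddr3_bandwidth_gbs(1600))  # i7-4770K, DDR3-1600 dual channel: 25.6 GB/s
```

This is why the i7-4770K's DDR3-1600 support translates into a roughly 20% higher peak bandwidth figure than the G3220's DDR3-1333.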

Intel Pentium G3220 Dual Core Processor Review

The retail packaging for the Intel Pentium G3220 certainly isn't what I would call exciting, but it serves its purpose of protecting the processor and giving us some information, such as the three-year warranty.

Intel Pentium G3220 Dual Core Processor Review

This particular side of the Pentium G3220 packaging gives us a look at the processor itself. Aside from that, there isn't much of interest to see here.

Intel Pentium G3220 Dual Core Processor Review

Along another edge of the G3220 packaging we can find the batch number, L323B628, the product code, BX80646G3220, as well as various other SKU numbers.

Intel Pentium G3220 Dual Core Processor Review

Opening up the retail package, we find the Intel Pentium G3220 itself packed within a secondary plastic case. This added layer of protection keeps the chip in place within the retail box and is great for handling and storing the G3220.

Intel Pentium G3220 Dual Core Processor Review

Let’s take a look at the test system and get to benchmarking!

  • Sugata Mitra

I have a GT 730 1GB… overall, if I don't change my GPU I have to change my G3220 processor, right?

  • Dhaval Dn

I don't play games. Would I notice a difference between the G3220 and the i3-3220?

  • Imtiaz Siyam

Will a GeForce GT 610 be alright?

  • Act Arus

Would you say the G3220 could run CS:GO without too many problems? I've got an Nvidia Quadro FX1800 to go along with it; not too great, I know, but sadly I haven't got the cash to get a good system. Hope you can help me out!

    • Jake

I run CS:GO with a G3220 and an R7 265 and I get up to 200 fps in competitive mode. It usually won't go lower than 120 in busy situations, so it's definitely playable. In busy dedicated DM servers the fps starts to feel choppy, going sub-90 at times. So it's not at the level of an i5 or anything, but definitely not bad. I'd be more worried about the Quadro FX1800; I don't know how that will fare in a relatively modern game with its slow 768MB of RAM. If I were you I'd wait until you have some more money and upgrade the whole thing instead of just the CPU. Hope this helped.

      • Act Arus

        Alright, thanks for the reply! Can you recommend a graphics card that could do the job without getting too expensive? I have about €60-80 ($75-87) in mind as my budget, the R7 costs quite a bit more and is therefore, sadly, not very suitable for my situation. I’m not sure if there even are cards out there that will do the trick for that amount of money… Anyway, thanks for helping me out!

        • Jake

At that price range you are mostly limited to GPUs with GDDR3 RAM, which aren't really suitable for gaming. Your best bet is to look at second-hand GPUs; you can probably find an HD7850 or something equivalent in that price range. Good luck finding something nice!

  • JHG

Duh, another test that aims to compare CPUs and then throws in a discrete graphics card. Of course the results are all within 3% of each other; the CPU is barely an issue in gaming, since the GPU does by far most of the work. Where can we see a comparison of integrated graphics performance???

  • Speed Master

Which motherboard did you use? I am using a GIGABYTE GA-H81M-DS2 and I can't seem to find the clock frequency.

  • Digiadam

    What about temps?

  • Joe H

I've had this CPU for months, and for an HTPC/family gaming rig it is the best you can get for the price. I use an AMD 7850 1GB OC with the G3220, and I have yet to find any games unplayable due to this CPU. The only issue I run into is bottlenecking related to the 7850's VRAM, but that's only in modded Skyrim with ENB, and anti-aliasing can be turned down in everything else.

    If you are looking to build a cheap entry-level gaming rig, save the money on the G3220 for now and put your savings into a higher end GPU and even a better MOBO. When you move up to a more powerful CPU and faster RAM you will be all set for HD gaming glory.

    I highly recommend this processor to anybody looking to pinch some pennies or to build a solid HTPC/family gaming computer.

  • Anon51

The RAM clock is really not important for the CPU unless you are using integrated graphics. Even 1600MHz is only achievable via XMP, and RAM clocks are never affected by the CPU multiplier, only the base clock. It is generally not beneficial to invest in RAM with higher clock speeds, as it is expensive and does not provide any significant boost in a system with a discrete graphics card. For me, XMP 1600MHz HyperX or equivalent is enough to play games.
Aside from frame rates, most gamers' main concern is loading time for games. Thus, for non-overclockers (which is not me), it is best to spend the extra bucks on a decent GPU and a 120GB SSD instead of spending an extra $300 on an Intel i7-4770K when you won't benefit much from the extra horsepower in games.
For those opting for a high-end GPU, I really do recommend an Intel i3-4150 with a Samsung 840 EVO SSD, which will give full satisfaction when every penny counts.

  • Anon51

I'm looking to build my cousin's PC using this CPU. I did a lookup and compared it with the newly launched G3240, and with the G2030 from Ivy Bridge. Since Ivy Bridge motherboards are quickly being phased out in Malaysia, I considered this one over the G3240, since it is cheaper and will give visibly similar performance in games with a decent graphics card.
A mid-range graphics card (HD7770, HD7790, R7 260X, R7 265, HD6850, HD6870, GTX 650 Ti, GTX 750, GTX 750 Ti, GTX 650) will give very similar performance when paired with this processor, even compared with the i7-4770K, in GPU-centric games such as Battlefield 4, Tomb Raider, NFS, etc.
High-end graphics cards (HD7870 onwards, R9 270X onwards, GTX 670 onwards, GTX 760 onwards) will pair better with a CPU with a higher thread count, such as the i5-4670K or the i7-4770K itself, to avoid bottlenecks, which really (really) ruin the game. CPU-centric titles such as TES V: Skyrim and Assassin's Creed IV really need a suitable CPU-GPU pairing.

  • Armagg3don

    Great micro…. micro!!!!!!!

  • David Tran

Curious as to how well this little chip can feed a multi-GPU setup.

  • Ahmed AL-Jaber

It's easy… it's because no software or game is really optimized for a full four cores; a dual core is advanced enough.

Why doesn't Intel make a 4.3GHz Pentium G? Maybe it could push the AMD X4 series for low-budget users.

  • Malloot

You guys should have tested this in some more CPU-intensive games to really find out the difference. StarCraft and Battlefield 3/4 on a 64-player map are the things you want to see tested.

    • Ivan Popov

You won't see a difference in StarCraft since it uses only two cores.

  • fridel maconee

I'm using this processor. Really good for its price. Paired with a Sapphire HD 7790 and 6GB of RAM, it can play most modern games no problem 😀

  • ephemeris

Memory: in your review you used 8GB of Dominator 2133MHz memory, but the Pentium is only rated for 1333MHz or 1600MHz. I'm considering an upgradable system, where you might want a higher-spec processor later, or by the same token better-performing RAM.

[ ] Can this type of RAM be 'clocked down', so that with a new system the better (higher-speed) RAM can be purchased first and more added sequentially later?*

[ ] Is being able to 'clock down' a property of the CPU itself, the motherboard, or the RAM?

*I noticed the similarity of the memory test results, in that both the Pentium and the i7 were actually doing about the same. So with this:

[ ] Was the i7's 8GB Dominator 2133MHz memory 'clocked down' to make an equivalent platform comparison for this test?

[ ] Why would the 8GB Dominator 2133MHz memory be easier to 'clock up' using the i7 and not the Pentium?

I'm basically considering upgradability on socket 1150, where I wouldn't want to buy different memory types, making them disposable at the first purchase. Not a lot of reviews say exactly how they set their memory when Intel's specs for most Haswell processors only state 1333MHz or 1600MHz.

Thanks for the review.

    • tb293009

Your questions are very valid. I was always wondering the same things, but now I understand.

All RAM is limited by the CPU. The Pentium can handle 1333MHz, so the BIOS tells the RAM, "Hey! You over there! The max this CPU can handle is 1333MHz, so you have to run at that so we can still operate together!" and it "downclocks" the RAM.

Everything mentioned is part of clocking down: the BIOS reports 1333MHz to the CPU, and the CPU, BIOS, and memory work in harmony.

Yes and no. The max frequency for RAM on any non-E (Extreme) processor is 1600MHz without overclocking. Don't get yourself started on overclocking; the cloud just becomes fuzzier. The i7 was running its memory at 1600MHz; however, memory speed usually does not matter (maybe 0.5 fps of difference) unless you are rendering video.

The i7, simply put, has an unlocked multiplier. This means you can change multiple things: core speed, memory speed, and voltages. So essentially, that CPU can run with 2133MHz memory.

So, in conclusion, look for some good CL8 DDR3 1600MHz RAM for an i3, i5, or i7, and good CL8 1333MHz RAM for a Pentium.

Hope this helps,


  • dude

For a couple of bucks more you should get the G3420… it's faster than the G3220 and supports memory at 1600MHz.

  • marty1480

Looks really impressive. There I was thinking dual core was dead in modern gaming; I guess I was wrong. Excellent review, and it is nice to see a site review products a little further down the range sheet.

    • JHG

      You could use a Celeron and the results would still be the same. All the work is done by the GPU.

      • marty1480

Celerons are pretty good, but Pentiums are a little better (bigger cache, faster speeds). I agree with you regarding the GPU, but you could still get a bottleneck from the CPU.

  • Ali D

A little is left to be desired. How would this CPU fare with a GTX 750 Ti, a Radeon HD 7970, or an equivalent or better graphics card? Nowadays 1080p is a standard resolution, and a GTX 670 was more than enough for it. So I just want to know how powerful a graphics card we can pair this processor with.

    • JHG

Any GPU will give the same result. The CPU only processes non-graphics data, like player positions, which do not scale with GPU performance.

      • Alex_D

Not exactly. Since this is a low-end CPU, these mid-range cards are going to be held back a bit, because the CPU can't keep up with everything the GPU is trying to process. You would get more performance with an i3-4130 or better.

      • George Cryman

Sorry JHG, but I have to say that you are not correct. The CPU matters too. I did my own test with an unlocked CPU, underclocking and overclocking it. Everything else in the system was the same; I played with the CPU frequency, and the gap between the overclocked and underclocked benchmark results was huge.

        • JHG

I'm talking about games, not benchmarks. Ali D was asking what the fastest GPU is that you can pair with the G3220, and the answer is: any GPU. The G3220 is powerful enough to handle the CPU-side tasks of even the fastest GPU in games.

    • No

I can never use my 750 Ti to its full potential because of my G3220. Not recommended; to anyone who has ever considered a G3220/750 Ti combo, I do not recommend it. I don't know why anyone ever said it was a good idea.

  • basroil

This thing isn't much better than a Core 2 Duo E8200 from 2007! You could have expected that performance if we were talking about a sub-30W processor, but no love there either. It doesn't make much sense as a standalone processor, and it's clear that modern (multithreaded) games will just destroy the chip.

    • Kristian Brødsgaard

Clock for clock, it's not far behind an i5, and a lot of applications don't NEED an i5 or i7. I've put one in an HTPC, and it's actually overkill. The system boots in 5-6 seconds, content never stutters, and it even does a bit of gaming on the integrated GPU.

      All the while idling at 28W, and totally passively cooled. (No moving parts at all in the system, and it’s completely silent).

      Does a C2D do that?

      • basroil

        “Clock for clock, it’s not far behind an i5,”
About 10% slower clock for clock in Cinebench, and a bit smaller margin when considering gaming.

From experience, a four-core i5-3570K is about 2.5-3x faster than an E8200 for Blender (closer to 3.5x when talking about very long renders; shorter ones tend to have a large chunk of single-threaded, unoptimized activity). Comparing this chip to an E8200 would have been a better thing for the rendering part, at least.

        “All the while idling at 28W, and totally passively cooled. (No moving parts at all in the system, and it’s completely silent).

        Does a C2D do that?”

Does one better: the power draw of an E8200 is just 28W at load!

The TDP for the E8200 is about 65W vs 53W for the Pentium, though the motherboard with the Pentium should draw less power since both the GPU and memory controller are integrated. However, you should be able to easily have a passively cooled E8200, and it's even easier to use very low-RPM fans (which are so quiet they are effectively silent).

Best of all, you can often get entire systems based on the E8200 for free; schools and libraries tend to let you take them (minus the HDD) so they don't need to pay for recycling.

        • Kristian Brødsgaard

          Yay, lost my entire post because links open in the same window from comments. Brilliant coding! 🙁

          10% difference isn’t what I’d consider a large difference, given the huge cost difference. It’s close enough.

As for the relative speed between the E8200 and G3220, I have a hard time finding benchmarks comparing them, but an E8200 scores sub-2000 in PassMark, and a G3220 scores around 3200.
So the G3220 is about 60% faster, and applications that take advantage of the modern architecture (Core 2 Duo is getting quite old) will probably widen the gap, in some cases significantly.

          Looking at power-draw, note that my numbers are at the plug. So subtract power supply inefficiencies and the rest of the system, and the CPU alone is far below 28W.

For the E8200 you can't look at just the CPU, since it doesn't have an onboard memory controller or GPU, so it's not really a fair comparison. In this review:

          They get 150W idle from the E8200. Bit more than 28W. 😉
          It’s obviously going to depend on what GPU you pair it with and so on, but the point is, you’ll have to add videocard and memory controller to the E8200 to make the comparison equal. The power draw of the CPU alone is worthless, since the CPU can’t do anything without the rest of the system, and looking at the platform as a whole, Haswell does MUCH better than Core 2 Duo ever did.

As for the last point: true, but you get stuck with an architecture that still uses DDR2 and is probably plain going to wear out at some point soon. Just buying DDR2 RAM for upgrades is expensive and a pain. If your budget is shoestring, it's better than nothing, but I would never recommend recycling hardware that old if I had the budget to build something more modern and expandable.

        • basroil

"They get 150W idle from the E8200. Bit more than 28W. 😉
It's obviously going to depend on what GPU you pair it with and so on,"

They paired it with an 8800 Ultra that idles at about 120-130W! It's hard to find specifics on that card (most test systems couldn't handle it), but in full-system, at-the-plug measurements it shows as using 80-100W more than a 1950 XTX, and that card idles at ~30W.

        • Kristian Brødsgaard

Fair enough. You still need to add in SOME kind of video, and anything similar to a Haswell iGPU is going to draw maybe 10-15W idle; a memory controller, 5-10W. What does the rest of the system draw? Also, 10-20% of efficiency is lost at the PSU, so add that in. (You lose quite a bit when you are below 30W.)

          (Also, please bear in mind the 1950XTX is far slower than even the iGPUs of today.)

Point being, the E8200 only draws 2.5W, yes; I'm not saying that isn't impressive (and lower than I'd have thought; I didn't recall Core 2s being particularly power-sipping), but later CPUs add in a lot of stuff that brings their power consumption up while removing it from other parts of the system.
My 28W system draws anywhere from 20-22W for the entire system (once PSU inefficiency is calculated).

          Frankly, that’s an amazing piece of engineering by Intel.

        • TunaFish

          I built a new workstation around the G3220 for daily work. My primary target was to keep power consumption down as low as possible since it will be on 24×7. It certainly didn’t disappoint in that aspect.

I am reading 21W at idle with a 128GB SSD and two 4TB HDDs spun down (using a fairly accurate energy meter). With both cores loaded 100%, I am getting 40-42W. An amazing piece of engineering by Intel indeed. I think the Corsair VS350 is also contributing a lot towards that efficiency.

  • Rahul

This CPU would be great with a GTX 750 Ti.

  • dead

Nice, thanks, I was looking for one of these. There are a few things I hope you can help me with. How does the G3220 compare against an i3 processor? And should I buy a graphics card or not, and which motherboard? I am thinking a max of $150 for motherboard + GPU. If that's fine, which motherboard + GPU in this range? Or, if motherboard only, then up to $120 and no GPU.

  • EzioAs

The results for Sleeping Dogs and Metro LL aren't available. Also, power consumption is listed as Metro LL too.