NVIDIA Shows Off G-SYNC Technology on StarCraft at BlizzCon 2013

NVIDIA was showing off its brand-new G-SYNC technology this weekend at BlizzCon 2013.  G-SYNC is a technology for gaming monitors that alleviates screen tearing and V-Sync input lag, enhancing the capabilities of existing monitor panels for smoother gameplay.  In the past, NVIDIA has only shown G-SYNC running as a demo in Tomb Raider. At BlizzCon 2013, NVIDIA had G-SYNC set up on ASUS VG248QE monitors running StarCraft side-by-side, with V-Sync enabled on one monitor and G-SYNC on the other. In person, you could definitely tell that NVIDIA G-SYNC was making the real gameplay video smoother.
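The difference the demo highlights comes down to when a finished frame is allowed to reach the screen. Here is a minimal sketch of the idea (the numbers and function names are illustrative, not NVIDIA's actual implementation): with V-Sync the monitor refreshes on a fixed 60 Hz grid, so a frame that misses a refresh tick waits for the next one, while a variable-refresh panel can display the frame as soon as it is ready.

```python
# Illustrative sketch: when a rendered frame can actually be displayed.
# With V-Sync, the monitor refreshes on a fixed 60 Hz grid, so a finished
# frame waits for the next tick; with variable refresh (the G-SYNC idea),
# the panel refreshes as soon as the frame is ready. All numbers are made up.
import math

REFRESH_MS = 1000 / 60  # fixed ~16.667 ms V-Sync refresh interval


def vsync_display_time(render_done_ms: float) -> float:
    """Frame is held until the next fixed refresh tick."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS


def gsync_display_time(render_done_ms: float) -> float:
    """Panel refreshes when the frame is ready (within panel limits)."""
    return render_done_ms


for done in [20.0, 25.0, 40.0]:  # hypothetical render-finish times in ms
    wait = vsync_display_time(done) - done
    print(f"frame done at {done:5.1f} ms -> V-Sync delays it {wait:5.2f} ms, "
          f"variable refresh shows it immediately")
```

This is why V-Sync can add up to a full refresh interval of latency at 60 Hz, which is exactly the stutter and input lag G-SYNC is designed to remove.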

In the video, you will be able to see this tearing and the resulting G-SYNC correction. According to NVIDIA, G-SYNC is only waiting on hardware partners such as ASUS, BenQ, ViewSonic, and Philips to roll out G-SYNC to consumers.

NVIDIA confirmed to us that it will be offering G-SYNC do-it-yourself kits for the ASUS VG248QE monitor, which is a very popular 144Hz gaming display that has been out for nearly a year. The upgrade process requires about 30 minutes of time and just a Phillips screwdriver. No word on the upgrade module price or release date just yet.


  • rosechelle

    wow, why are people always so angry in the comments 🙁

  • Dave

    Don’t forget that OLED monitors and TVs are on the horizon. I’ll give G-SYNC a miss at least until these arrive…

  • Me

    I’d rather spend the money on an extra graphics card and keep my existing monitors. That would be even better at coping with higher refresh rates without the blurring that G-Sync gives (it’s a side effect if the framerate gets too low).

    • Me

      That’s just wrong, on so many levels.

  • Polaco

    why would you need a 144Hz monitor? isn’t G-Sync synchronizing the screen output to the video card’s frame rendering, and vice versa, to keep things smooth?

    • Serpent of Darkness

      G-Sync basically regulates the static refresh below 60 FPS, and it acts as a frame buffer on the side to reduce dropped frames. Since FPS and frame time are regulated at the hardware level on the GTX Titan and 700-series NVIDIA cards, those cards can communicate with G-Sync through the DVI line. Basically, the card tells G-Sync when it will get a frame, G-Sync stores it, then lets it get scanned properly in the 16.667 ms scan window. Any additional new frames after that are held by G-Sync, or they get dropped by the monitor. In a sense it is hardware-based V-Sync too, but the hardware knows exactly when to sync with the scan windows instead of going by sequence at the software level.

      The thing about 144 Hz monitors (6.944 ms) is that the scan window for frames is reduced. Each scan is shorter, but scans occur more frequently, which lets more frames be displayed in a short interval of time: 16.667 ms versus 6.944 ms is a factor of 2.4 improvement with a 144Hz monitor. It’s like a conveyor belt speeding up, but with smaller slots to scan frames into.

      If you’re using AMD graphics cards with their version of AFR, they just mini-gun frames at the monitor and see what hits or misses. This is ideal for higher-refresh-rate monitors. If you had two R9 290Xs with 2.0 scaling in a game that runs at 40 fps on one card, the second card would push another 40 fps, for a total of 80 fps (12.5 ms). So basically one in every two scans will be frameless, but your eyes can’t see a frameless scan at 6.944 ms. The scans overlap one another, so even with one frameless scan in two you still see smooth fluidity between frames because there is zero droppage, and FPS will still show 80. Why would you need a 144 Hz monitor? With G-Sync, the answer is you wouldn’t, but you would be refresh-rate capped back down to 60 Hz.

      G-Sync regulates when the frame will be scanned by the monitor. It will keep things smooth, and it will display all frames with less frame droppage, sort of like what RadeonPro’s Dynamic Frame Control does at the software level: it locks an FPS and displays 100% of incoming frames with less than 1.0% droppage. The result is that you’d see smooth, full amounts of frames and animation. At 30 FPS and below, G-Sync is rumored to have ghosting, meaning massive amounts of response lag; probably an issue that will be corrected by a firmware update applied through your computer.

      Sadly, the truth about G-Sync is that it’s NVIDIA’s way of selling off its leftover stock of Tegra 4 processors in a cellphone/tablet market that is stagnating for NVIDIA. A majority of their sales are in the graphics card market for workstation and gaming use. They will also try to branch into the server processor market against Intel and AMD for additional revenue. I don’t think the NVIDIA Shield is selling like hotcakes, and it looks like the GTX 780 Ti has stagnated graphics card sales for NVIDIA as well: not a lot of GTX 780 Tis are sold out on Newegg or other sites. Even though NVIDIA had a decent quarter on the stock market, I suspect their stock will drop a little more in Q4 and 2014 Q1 until Maxwell comes out. The GTX 780 Ti was a kick in the balls to the NVIDIA base who previously owned the GTX 780 and Titan…

      • Serpent of Darkness

        To add salt to the wound, the GTX Titan is still priced around $1,000.00 versus the GTX 780 Ti at $699.99. The GTX Titan is now an overpriced, underperforming relic/paperweight that retains its previous price. You can claim it still holds its value, but there’s really no point in buying it if the GTX 780 Ti is theoretically supposed to pull 15% more performance than the Titan. The GTX Titan has 7% fewer CUDA cores and lower clock frequencies. In addition, you get full DirectX 11.2 API support, and possibly full PCIe 3.0 x16 support too. You pay so much more for NVIDIA, just for the name, and sometimes you get so much less… If that’s not a kick in the balls by the brand, then I don’t know what is… The R9 290X seems to be sold out more often than the GTX 780 Ti.

        • Storagemountain

          The Titan still has 6 GB of RAM compared to the 3 GB inside the 780 Ti, but that still doesn’t make up for the horrendous price of $1,000 (with which you could buy two GTX 780s, provided you have about $100 more), since every other spec of the 780 Ti is better than the Titan’s: bandwidth, memory clock, CUDA cores and many more.

  • KLMR

    It could be a good start for an open standard; problem is, NVIDIA probably won’t want that. It is true that V-Sync is pretty old.
    That said, showing whatever happens WITH A VIDEO, knowing you want to show tearing, V-Sync, G-Sync or whatever at 144Hz, is pretty weird…
    Like the Sony Bravia TV ads.

  • Dan

    Release a module for ALL, or most, modern monitors and TVs and I’ll bite. Many use their HDTVs for gaming these days; there is no way I could go back to a 24/27-inch panel. And for only a select few vendors? Dumb… And $400 for a 24-inch G-Sync-enabled monitor? Get the F*** outa here, NVIDIA…

    • Paul Margettas

      yeah, that is pretty fucking retarded. This chip shouldn’t cost that much. I thought about asking about it being an external chip, but that wouldn’t make a difference, would it?

      • SoulWager

        It’s expensive because it uses an FPGA. ASIC would be cheaper in high volume, but that takes more time to develop.

        It can’t be an external chip, it has to be part of the monitor, otherwise the monitor would just get confused. If it could be done with an external chip, they’d have just made it part of the GPU.

    • fvwfsdhgsfgvd

      are you fucking brain dead you fucking retard?

      does your TV do 144Hz? what resolution is your TV? shitty 1080p? thought so. don’t fucking compare monitors to TVs, I don’t like my pixels as large as my face. I’d use a monitor over a TV any day. Get the Fuck outa here, idiot.

      • Virus

        Who’s going to swap their IPS for a TN panel? I see more people adopting this crap on their TVs than on their expensive IPS monitors, you retard… I guess you should GTFO of here also.

        • PoWn3d_0704

          I did. I have the Dell U2412M as my secondary to the ASUS VG248QE 144Hz monitor. After playing with the settings, the colors on the QE are really close to the colors on my IPS. Plus, I can’t stand gaming on my IPS anymore; 60Hz is simply too slow.

          TVs introduce way too much lag to be used as anything but TVs. There are only a select few that are worth owning as a PC user.

          I’ll be buying the Gsync module for my Asus monitor the second it becomes available.
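The scan-window arithmetic that comes up in the comments above (16.667 ms per refresh at 60 Hz, 6.944 ms at 144 Hz, 12.5 ms per frame at 80 fps, a 2.4x factor between 60 Hz and 144 Hz) is easy to verify with a quick calculation:

```python
# Sanity-check the refresh-interval numbers discussed in the comments:
# a refresh interval in milliseconds is just 1000 / refresh_rate.
hz_60 = 1000 / 60     # ~16.667 ms per refresh at 60 Hz
hz_144 = 1000 / 144   # ~6.944 ms per refresh at 144 Hz
hz_80 = 1000 / 80     # 12.5 ms per frame at 80 fps

print(f"60 Hz interval:    {hz_60:.3f} ms")
print(f"144 Hz interval:   {hz_144:.3f} ms")
print(f"60 -> 144 factor:  {hz_60 / hz_144:.1f}x")
print(f"80 fps frame time: {hz_80:.1f} ms")
```

The 2.4x figure is simply the ratio of the two refresh rates (144 / 60), not a measured performance gain.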