NVIDIA G-Sync Demonstrated on 4K Monitor For The First Time At CES 2014


NVIDIA had a large presence at CES 2014 and we were particularly interested in what they had on display when it came to gaming at 4K Ultra HD resolutions.


NVIDIA also custom modded an ASUS PQ321Q 4K monitor with NVIDIA G-Sync to show off how they can make Ultra HD gaming (3840×2160) look incredible! We’ve seen plenty of demos of NVIDIA G-Sync running on the ASUS VG248QE at 1080P, but this really takes it to the next level! If you own the ASUS PQ321Q, don’t get too excited, though, as neither NVIDIA nor ASUS has plans to offer an upgrade kit for this model, but there are 4K NVIDIA G-Sync monitors in the works. The demo consisted of a pair of systems, each running NVIDIA GeForce GTX Titan 6GB video cards in 2-way SLI, with G-Sync enabled on one monitor and disabled on the other. There was a clear difference in smoothness between the two, and this is the first time that we were able to play a real game on a 4K monitor with the image quality cranked up without seeing any stuttering or tearing!
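The smoothness difference comes down to when finished frames reach the screen. As a rough illustration only (a toy timing model, not NVIDIA's actual implementation), compare a fixed 60 Hz display, where a finished frame must wait for the next refresh tick, with a variable-refresh display that refreshes the moment each frame is ready:

```python
# Toy timing model (illustrative only, not NVIDIA's implementation):
# a fixed-refresh display shows a finished frame at the NEXT refresh
# tick, while a variable-refresh display (G-Sync, conceptually)
# refreshes the moment the frame is ready.

def vsync_display_times(render_done, refresh_hz=60):
    """Times at which frames appear on a fixed-refresh display."""
    interval = 1.0 / refresh_hz
    return [(int(t / interval) + 1) * interval for t in render_done]

def variable_refresh_times(render_done):
    """A variable-refresh display scans out as soon as a frame is done."""
    return list(render_done)

def gaps_ms(times):
    """Frame-to-frame gaps the viewer actually sees, in milliseconds."""
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

# A GPU finishing a frame every 24 ms (~42 fps) against a 60 Hz panel:
render_done = [0.024 * (i + 1) for i in range(4)]

print(gaps_ms(vsync_display_times(render_done)))     # [16.7, 33.3, 16.7] -- judder
print(gaps_ms(variable_refresh_times(render_done)))  # [24.0, 24.0, 24.0] -- even pacing
```

The alternating 16.7/33.3 ms gaps on the fixed-refresh display are the stutter described above; the variable-refresh display paces frames evenly at whatever rate the GPU happens to deliver them.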


NVIDIA’s Tom Petersen offered to give Legit Reviews an overview and explanation of NVIDIA G-Sync and the 4K demo that they made just for CES 2014. You can watch that clip above!

  • Wrend

    So.. G-Sync brings down the quality of your needlessly overpriced monitors to better match and mesh with the lower quality performance of your video card(s)?


    And no, 60FPS MINIMUM, 120FPS+ for the elite.


      Where are people getting their information?! The image quality goes up; it only gets better… if you want it to, by investing in a technology that offers a far superior gaming/visual experience. And the test bed shouldn’t have any effect on what you decide to do with what is being demoed…

      Obviously, if one can’t afford a GSync-enabled 4K monitor, then in “theory” one would not be able to afford what is necessary to run a game at that setting…

      This tech seriously only makes things look and play better… And yes, if you want it, just like everything else in life, you are going to pay for it.

      This aims to make the gaming experience better on whatever game you are playing at whatever settings you have them on. If you can achieve a minimum of 30fps this makes the game play like a dream. You are able to achieve the visual fidelity you get (but better) with v-sync on and the reactive quick-twitch response we all want in game. Obviously with more frames you will probably get a better experience, but this is not just for a lower end system or a beast of one.

      This tech really does solve an issue that has been prevalent since the inception of PCs. And I do like that it is up to the end user to make the choice and isn’t forced on them… Who knows, maybe one day it’ll get so much attention it becomes a standard for most devices of the future.

      • Wrend

        I didn’t say image quality. I said “quality of your … monitors,” as in, performance potential.

        G-Sync is a patch for a problem that wouldn’t exist if graphics cards were good enough to output at the highest quality levels monitors can perform at.

        I am not disagreeing that image quality will improve using G-Sync. However, in my opinion, it is largely a workaround, not a solution, and hopefully improvement in video cards will make it obsolete sooner rather than later.

        • YOUDIEMOFO

          Do your research on the graphics cards… They are powerful enough today; it is just the devs’ coding and/or the drivers, along with the APIs they have to use, in order to get the programs running on said hardware. You will always have a horrible overhead on PCs when it comes to real-world performance versus paper performance.

          The only way to get a solution like NVIDIA’s is to implement what they have done with GSync.

          The monitors and the video adapter need to “talk” together. Your ability to spit out 100+ FPS does matter somewhat, sure, but you’re still left with a screen that tears and stutters… Flat out, it’s a fact; it has been happening for years and will keep on happening. Even if you match your monitor’s refresh rate with your FPS you still get piss-poor video quality.

          This is what they’re getting at… People like me who spend, oooh, around 10k on a computer, but I still get shitty video reproduction on screen because of lackluster tech that has been in place since the days of CRTs!!

          Like I said before, if people cannot discern the difference between playing on a 120+ Hz monitor and a 60 Hz one, then this probably won’t help you at all, since you can’t perceive what’s being drawn in front of you fast enough already… So please show me where you are getting your information from, since all of what you have said so far about GSync has been about 95% wrong.

  • oyoy

    Titan 6GB video cards running in 2-way SLI, and the quoted result is ‘it’s just chunky, chunky, chunky…’. So NVIDIA admits that spending $2000 won’t solve the bad 4K experience with its new products (monitors) for 2014–2016? I think they should focus more on a new GTX 800/900 series with a 4x stronger card instead of ‘Buy more products on top of products!’ It’s just odd, in my humble opinion.

    • Bob Austin

      If their video cards are too good, they might not make a profit off GSync. So they need their cards to be crappy enough to be good with GSync. That’s why they’re holding off so long on Maxwell. A powerful GPU that can run at 60 or 120 Hz doesn’t need GSync.


        I don’t think that either of you two know what you are talking about. Especially when it comes to GSync.

        It isn’t about performance when it comes to GSync; it’s when performance drops and dips, as it does in most games, that GSync shines the most. And still, even without performance taken into consideration, this solution offers a more satisfying experience all around when talking about visual fidelity.

        Even if you get your frames up to your magical refresh rate you still have a crappy, torn image all over your screen. So yes, powerful GPUs will still require something like GSync to make the image as perfect as possible. And this will be said forever, because again, this has been an issue since the inception of PCs.

        The solution brought forth by this tech solves the issues that have been plaguing PCs, and how games have been played, for years.

        So yeah, if there’s a solution to something that requires the end user to spend more money in order to be satisfied, then I say sure. But the ones that either can’t afford it or do not want to invest should just stop hating… If you can’t afford it, that doesn’t mean you should knock it.

        Do some research before you think you should open your mouth or put fingers to keyboard……

        • FUCK AMD

          I don’t think that either of you three know what you are talking about. Especially when it comes to GSync.

          V-Sync bad, G-Sync good. G-Sync more money but wurf.

        • YOUDIEMOFO

          ????? You lost me….. I know exactly what I’m talking about when it comes to GSync.

          Have you seen it in person to truly know what it is capable of and not just on some YouTube screen?

          I know what to spend money on when it comes to what I want, and I always do my research. This concept from NVIDIA is the most acceptable means to end the BS screen issues we PC gamers have always been plagued with.

          They aren’t relying on any “developers” to make their game more optimized out of the box. All optimization can and will be done through NVIDIA and their drivers, whether through better DX11 support or by going with the “better” choice like OpenGL. And the visual impact is supported by hardware; if the end user decides to buy it, it is up to them to do so and to reap the rewards.

          So a multi-platform game that isn’t coded for specially (i.e., Mantle) will run as well on an NVIDIA system as it would on a non-AMD system, but will look and play much more smoothly on an NVIDIA

        • YOUDIEMOFO

          GSync-optimized system… My iPad is being a wh@re and wouldn’t allow me to edit my last post…

        • Wrend

          No, that’s what VSync, FPS limit, and so on are for.

          GSync’s only real use seems to be to make inferior graphics card performance look a little better.

          Think about it this way: If they wanted to, they could just as easily incorporate whatever frame/refresh rate limiting hardware they use in GSync in their graphics cards to match the rate of the monitors being used, given of course that their graphics cards are capable enough to output at that frame rate.

          I have two EVGA 680 FTW+ cards in SLI, and in August plan to upgrade to the two best single GPU cards EVGA can sell me (up to 3000USD). I’m just hoping they have something worthy to offer by then.

        • YOUDIEMOFO

          If one were knowledgeable about what v-sync does, especially while in game, and actually compared it to what this solution by NVIDIA offers, and still could not see a difference, that is pure insanity…

          The performance impact of v-sync is not acceptable when playing fast-twitch FPS games and others that quickly require pinpoint accuracy. Not to mention your input lag goes up exponentially when utilizing said v-sync. And without v-sync you are left with tearing and stuttering of frames, because the already tenuous “attempted” communication between the video adapter and monitor is cut off entirely when disabling v-sync.
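The input-lag point in this thread can be sketched with the same kind of toy model (assumed round numbers, not hardware measurements): with v-sync, a finished frame has to wait for the next fixed refresh tick before it can be scanned out, and that wait adds to input-to-display latency; with a variable refresh it does not.

```python
# Toy latency model (assumed numbers, not hardware measurements):
# time from a frame starting to render to it reaching the screen.

TICK_MS = 1000.0 / 60  # one refresh interval on a 60 Hz panel, ~16.7 ms

def vsync_latency_ms(render_ms):
    """Frame finishes after render_ms, then waits for the next tick."""
    ticks_needed = int(render_ms / TICK_MS) + 1  # next scanout after it's ready
    return ticks_needed * TICK_MS

def vrr_latency_ms(render_ms):
    """Variable refresh: the frame is scanned out as soon as it's done."""
    return render_ms

# A 10 ms frame: v-sync holds it until the 16.7 ms tick.
print(round(vsync_latency_ms(10.0), 1))  # 16.7
print(round(vrr_latency_ms(10.0), 1))    # 10.0
```

In this simplified model the v-sync penalty is up to one full refresh interval per frame (and real double-buffered v-sync can queue frames and add more), which is the added sluggishness the commenter is describing.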

          If you cannot notice a change in control “sensation” or accuracy when enabling v-sync in any game, then yeah, I’m sure you wouldn’t agree with GSync and what it truly represents.

          It doesn’t matter how many frames your PC can spit out; it doesn’t at all. The game that is being played, and has been played, on consumers is that you cannot, cannot, get the monitors to talk directly to the video adapter.

        • Wrend

          If one can read a comment from another for what it states in the given context, rather than not being able to see beyond their bias, then that one wouldn’t respond to my post as you have…

          The solution is to have video cards that are (1) able to output at a high enough frame rate at a high enough resolution, and (2) are able to match through internal hardware rendering and output the refresh rates of monitors being used.

          G-Sync is not a solution to the inequities of current video card technology. It is only a workaround, as it does not match the image performance of a graphics card to the performance potential of a monitor; it does the reverse.

          I hope I have made this clear enough for you to understand now.

        • YOUDIEMOFO

          People seriously need to do their homework before thinking they know what is going on here.

          GSync offers the ability for there to be perfect communication between the monitor and the video adapter. There are no more dropped frames, no more half-rendered frames, and no more stuttering because you got more than one frame at a time!

          The only time FPS comes into account is if you’re below 30, and if so you need to change something in your rig/setup options already in order to play the game halfway decently. And that is not because of GSync; it is because it is a fact that if any game goes below 30fps, the experience is crap from there on.

          So people, keep on spouting off about stuff you have no idea about. Besides, I don’t know why it seems as if people are butt-hurt over this… This is a good thing for us consumers: to finally get a break from the horrible draw/display issues we have been having for years.

        • powerwiz

          Finally, someone who gets what GSYNC is. Every stinking post on every review is full of idiots who can’t get past the TN monitors they’re using, when they do not realize that a TN monitor with GSYNC on it blows their 27-inch IPS monitor away.

          What NVIDIA is bringing to the monitor world is nothing short of a revolution, considering that the old method is to simply sell a monitor with a higher refresh frequency, which does nothing.

        • YOUDIEMOFO

          I just wish that more people truly knew what it was all about and how great it really is that nvidia is coming out with a solution like this.