
Thoughts on NVIDIA G-Sync

I'll do what NVIDIA did to introduce this: explain the problem with how video game graphics are rendered and displayed to us. Skip below to the "Thoughts" section if you're already familiar with this.

What?

Computer monitors and TVs typically run at a fixed refresh rate, but GPUs don't render each frame at set intervals; they render frames as fast as they can. That creates a disconnect between a refresh on the display and the framerate, resulting in screen tearing: part of a previous frame is displayed with part of the next frame, producing a tear line, which is generally pretty ugly.
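To give a rough feel for why the tear line appears where it does, here's a small sketch. The numbers (60Hz panel, 1080 rows) and the function are my own illustration, not anything from NVIDIA:

```python
# Rough illustration of a tear line: if the GPU swaps buffers while the display
# is partway through scanning out a 60Hz frame, rows above the current scanline
# come from the old frame and rows below come from the new one.
# All numbers here are assumptions for illustration only.

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ   # ~16.7 ms to draw one frame top to bottom
SCREEN_ROWS = 1080

def tear_row(swap_time_ms: float) -> int:
    """Row at which the tear appears, given how far (ms) into the scanout the swap happened."""
    progress = (swap_time_ms % SCANOUT_MS) / SCANOUT_MS
    return int(progress * SCREEN_ROWS)

# A swap 10 ms into the scanout tears roughly 60% of the way down the screen.
print(tear_row(10.0))  # -> 648
```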

The solution was vsync: make the GPU wait and send frames at a set interval to match the refresh rate, and you get a nice smooth image. Problem solved!

But then you realise all that waiting around the GPU is doing has introduced lag, specifically input lag. In some games it isn't noticeable; in others it is, and it is unbearable. Not only that, but if your GPU can't match the refresh rate (it isn't powerful enough, for example), vsync locks it down to a lower framerate (e.g. 45/30fps). This can happen very frequently, and the constant large swings in framerate result in a noticeable and annoying stutter.
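To make that lock-down concrete, here's a minimal sketch of why just missing the refresh rate costs so much. It assumes a 60Hz display and simple double-buffered vsync where every frame waits for the next refresh boundary; the numbers are illustrative, not measured:

```python
# A minimal sketch (assumed 60Hz display, double-buffered vsync) of why a GPU
# that just misses the refresh rate gets locked to a much lower framerate:
# each frame has to wait for the next refresh boundary before it can be shown.

import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refreshes

def effective_fps(render_ms: float) -> float:
    """Framerate you actually see when each frame must wait for a refresh boundary."""
    refreshes_waited = math.ceil(render_ms / REFRESH_MS)
    return 1000 / (refreshes_waited * REFRESH_MS)

print(effective_fps(15.0))  # renders fast enough  -> 60 fps
print(effective_fps(18.0))  # just misses a refresh -> 30 fps
print(effective_fps(35.0))  # misses two            -> 20 fps
```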

Another solution! Adaptive vsync, a sort of innovation by John Carmack implemented at driver level by NVIDIA, is a way to try and avoid these scenarios:

If your GPU has enough power to meet the refresh rate, vsync is enabled and you get no tearing. But if your framerate begins to drop, vsync is automatically disabled to avoid the stutter: you get a variable framerate below the refresh rate, but without the drastic forced drops. That variable framerate brings tearing back into play, but it is the lesser of the two evils.

Adaptive vsync is probably the best solution we have now, but it is a solution that is based around compromise.
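Roughly, the trade-off boils down to a decision like the sketch below. This is my own simplification of the behaviour described above, not NVIDIA's actual driver logic:

```python
# A rough sketch of the adaptive vsync trade-off: sync when the GPU can keep up
# with the display, otherwise accept tearing to avoid the stutter.
# This is a simplification for illustration, not NVIDIA's implementation.

REFRESH_HZ = 60

def adaptive_vsync(current_fps: float) -> bool:
    """Return True if vsync should be enabled at this moment."""
    return current_fps >= REFRESH_HZ

for fps in (90, 60, 45):
    mode = "vsync on (no tearing)" if adaptive_vsync(fps) else "vsync off (tearing, but no stutter)"
    print(f"{fps} fps -> {mode}")
```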

NVIDIA has found what seems like the solution, and it is stupidly simple in concept. Make the GPU and display work together! Don't make the GPU work on the display's terms, make the monitor work for the GPU - the GPU is the more important partner in that relationship.

Let the GPU control the refresh rate of the monitor, so that every frame the GPU renders lines up with a refresh of the monitor, no matter what the framerate is. The only limit here is the maximum refresh rate of the display (up to 144Hz at the moment).

If your framerate is 86fps at a particular instant, your monitor will run at 86Hz. The good thing about this isn't just that it solves those problems, but that it gives a smoother image even where the problem wasn't particularly obvious, e.g. running a game at 45fps.
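In other words, the relationship is about as simple as it gets. Here's a sketch of the idea as I've described it; the cap and function name are mine, for illustration:

```python
# A sketch of the G-Sync idea described above: the display refreshes when a
# frame is ready, so the refresh rate simply tracks the framerate, capped by
# the panel's maximum. Names and numbers are illustrative assumptions.

PANEL_MAX_HZ = 144

def display_refresh_hz(gpu_fps: float) -> float:
    """Refresh rate the panel runs at when it is driven by frame delivery."""
    return min(gpu_fps, PANEL_MAX_HZ)

for fps in (86, 45, 200):
    print(f"GPU at {fps} fps -> panel at {display_refresh_hz(fps)} Hz")
```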

Thoughts

Super excited. It depends on a chip in the monitor that NVIDIA have said will be available from various manufacturers, and will even be provided as part of a DIY kit for a certain ASUS monitor (and potentially others).

This is fine: monitors aren't capable of this at the moment, so we need extra hardware to do it. Where my concerns begin is the fact that this will be vendor specific. I'm on a 780 at the moment, so I can (and may) end up getting a monitor capable of this, but I am concerned for the same reasons I am concerned about AMD's Mantle.

The difference here is that this may end up being one of the biggest innovations in the area for quite some time, and AMD and potentially Intel aren't going to leave it unanswered.

This is potentially a bit more annoying because:

  • We may end up with AMD-supported and NVIDIA-supported monitors. AMD will answer this with their own solution, perhaps Intel too (although their graphics generally address a different market). Perhaps most monitors will carry both the hardware required for G-Sync and an AMD equivalent. If so:

  • Would that drive up prices? Perhaps for a while, but I suspect this hardware will come down in price fairly quickly, so that isn't really an issue. Even then:

  • This has a potential lock-out effect on any newcomers to the market. It seems very unlikely at the moment that anyone would attempt to break into the PC graphics market, at least at the end we're interested in, but it's an uncomfortable thought regardless.

I'm concerned about the effect of this sort of thing on the industry - fragmentation hurts PC gaming, and we need vendor-neutral solutions.

It's really difficult, because who will develop these vendor-neutral solutions? Companies like Valve, Oculus, etc. seem like the best candidates, but I'm not sure how well set up they are to do that sort of thing. Oculus is a small startup without a single consumer product on the market yet, and Valve are still quite a small company, but with SteamOS and the Steam Machines they are going to become more important in this part of the industry.

NVIDIA and AMD aren't going to spend money developing things for the "good of the industry" if doing so won't drive people to buy their cards. That's completely understandable and justifiable, especially in cases like this - it isn't as if they're extending existing functionality; this is brand new hardware that is required for this to work.

I'd like NVIDIA to extend an arm out to third parties for G-Sync: perhaps find a way to add value for customers who choose NVIDIA cards, whilst providing the basic functionality for all vendors to utilise. A fragmented industry harms everyone, and this is an industry NVIDIA and AMD probably have reason to protect.