
Adaptive refresh is a necessary fix

… that we aren't going to get, yet.

Linus (of LinusTechTips on YouTube) said something that resonated with me and sums up the effect of adaptive refresh well (paraphrasing): before you got an SSD, you didn't think you needed an SSD. You had a hard drive, your computer worked fine, and SSDs are expensive! But ask most people to go back to a hard-drive-based system (for OS + applications) and they'll find it unbearable. I know I do.

As such, whilst I believe adaptive refresh is a fix for a much more fundamental problem, after 4 days with a G-SYNC monitor I think the effect is similar. Once you have this you won't want to go back; you'll wonder how you ever coped. It's not a shiny new feature, it's a fix. It's a fix for something we've had to put up with forever, something many of us have gotten so used to that we don't even see it as a problem anymore!

But on to what adaptive refresh actually is. It is, of course, what you probably know as G-SYNC or FreeSync; I'm giving it a vendor-neutral name in this context. It solves two main things:

  1. vsync on/off related issues
    • Screen tearing
    • Input lag
    • Stutter
  2. The need for a game to perform at the fixed refresh-rate intervals of a traditional display.

I detailed my thoughts on G-SYNC, along with a brief explanation of what it is and how it solves this problem, upon its announcement here, but I would like to emphasise its importance with some context.

There is a bit of an ongoing debate within the video game industry at the moment, sparked by the new consoles: 30 vs 60fps. It isn't a new debate. Ignoring the reasons it has flared up again and why the demand coincided with the release of new consoles, let us just look at how adaptive refresh could make the debate irrelevant. It's a basic explanation of the benefits, but one I'd like to frame in the context of console games.

We're no longer confined to the set intervals of traditional displays.

A developer doesn't have to ensure their game reaches 60fps most of the time; they could have an average of 54fps and it'd look pretty much as good as 60 to most people.

A game on a traditional non-adaptive display running at an average of 54fps wouldn't look great; you would get screen tearing and stuttering. But on an adaptive refresh display, if your game is running at 50fps at any particular moment, then your refresh rate is 50Hz. You don't get the tearing or stuttering.

That is the main reason why 60fps looks so much better than 50fps on a 60Hz display: each frame is in sync with each refresh.
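To make that arithmetic concrete, here's a rough Python sketch of my own (it assumes vsync simply holds a finished frame until the next refresh tick, ignores render-time variance, and the 50fps/60Hz numbers are just illustrative) showing the gap between displayed frames on a fixed 60Hz display versus an adaptive one:

```python
# A simplified model: at 50fps each frame takes 20ms to render; a fixed
# 60Hz display only updates every ~16.67ms, so with vsync a finished frame
# waits for the next refresh tick. An adaptive display refreshes the moment
# the frame is ready.
import math

FPS = 50                          # the game's render rate (illustrative)
REFRESH_HZ = 60                   # a traditional fixed-refresh display

frame_time = 1000 / FPS           # 20ms per rendered frame
refresh_time = 1000 / REFRESH_HZ  # ~16.67ms per display refresh

previous_fixed = previous_adaptive = 0.0
for frame in range(1, 7):
    ready_at = frame * frame_time
    # Fixed display with vsync: the finished frame waits for the next refresh tick.
    shown_fixed = math.ceil(ready_at / refresh_time) * refresh_time
    # Adaptive refresh: the display refreshes as soon as the frame is ready.
    shown_adaptive = ready_at
    print(f"frame {frame}: fixed 60Hz gap {shown_fixed - previous_fixed:5.2f}ms, "
          f"adaptive gap {shown_adaptive - previous_adaptive:5.2f}ms")
    previous_fixed, previous_adaptive = shown_fixed, shown_adaptive
```

Run it and the fixed-refresh column shows a ~33ms gap every few frames while the rest sit at ~17ms; that uneven pacing is the stutter (or, with vsync off, the tearing). The adaptive column is a steady 20ms every time.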

What would be the impact on game design?

The primary advantage, I believe, would be more relaxed use of resources. You're given more freedom to strike the balance between performance and graphical fidelity, and more choice over when to tip that balance either way in different scenarios within your game.

Wild fluctuations in your framerate wouldn't be a great experience either, but you could run a certain event in your game at 45fps to achieve the level of graphical fidelity that scene demands, whilst bumping it back up towards 55 - 60fps in scenes that benefit from the higher framerate.

Why stop at 60fps/Hz?

Exactly. Why? We don't need to. We have PC monitors that truly refresh at 144Hz, and as such we can play and see games running at 144fps; after 4 days with this 144Hz G-SYNC monitor, I'm a believer too.

There would be clear benefits to adaptive refresh on 60Hz displays, but I think the benefits are even greater on a display with a higher refresh rate. Not only do you get an even smoother game, but you also give developers even more room to work within, for the reasons in the previous section.

Many people would say they couldn't deal with going from 60fps in one scene to 140fps in another, and maybe that's true, but in my experience of PC games that rarely happens. Unless there are technical issues with the game, you're likely to find that your fluctuations will be around 30 - 40fps in the 'bad' cases, and on an adaptive refresh monitor those aren't jarring; you notice the increased or decreased performance, but it doesn't have the jarring effect you would normally feel.

That is all on PC, where your framerate is anything but guaranteed. I'm talking about consoles here, where one of the advantages is the tightly controlled development environment: a developer can control this experience and ensure the fluctuations in framerate don't deliver a subpar experience. They could gradually increase or decrease the average framerate, delivering better performance or graphical fidelity for different situations.

So why aren't we getting it?

We do have it - but only on PC. The open platform and market for enthusiast gaming hardware allowed this to come to market relatively quickly, but the same cannot be said for the console market.

I'll deal with the easy one first. Higher refresh rates aren't really found on TVs. You get modes for active 3D and the odd bit of motion interpolation, but they're usually not there to accept a true higher refresh rate signal from a source (e.g. a video game console or a PC). The panels are often capable of, for example, 120Hz, but not all of them accept it as an input, and the ones that do are undocumented and unsupported.

The second is the hardware and standards required for the communication channel between the source and the TV, so that the source can control the refresh rate of the TV on demand. This requires supporting hardware in the TV, a supported communication channel (e.g. over HDMI, DisplayPort or another connector) and support in the source (your games console).

The TV industry at the moment is a commodity market and its primary focus is on television content. Convincing manufacturers to include a not-insignificant change to their displays solely for the benefit of video games would be difficult, especially when the TV manufacturers aren't the same companies that make video game consoles (more so now that Sony is spinning off its TV business).

Thirdly, I don't suspect this kind of change would happen within a generation of consoles. There is theoretically not much stopping them from introducing it on a PS4 or Xbox One, for example; it just seems unlikely given how the development of revisions is typically handled.

This would be functionality to be introduced with a new generation, and would it be introduced without support from the TV manufacturers? I'm not so sure. But if it takes hold on PCs and becomes the norm, I could see it happening. TV manufacturers often also manufacture computer monitors, so if adaptive refresh becomes the standard in the PC world it seems like a much smaller step to see it introduced in their TVs.

But in the first sentence of this post, I said we aren't going to get it yet. The above is why I think that is the case.

That is more than a little sad; it's an important fix that would deliver significant technical improvements to video games, which are a technical medium. I just hope I can be optimistic in thinking that it will happen for consoles, only not for a very long time to come. I don't even want to entertain the idea that it'll be ignored and dismissed.

But I do fear that without the demand there is that risk. Who will demand a fix for a problem they have accepted and are simply used to? Probably not your average consumer. Developers are probably our only hope.