NVIDIA G-SYNC

TECHNOLOGY EXPLAINED

You’ve all heard of G-Sync and how it delivers a smoother gaming experience, but exactly how it does so is commonly misunderstood. There is more to a smooth gaming experience than just a high frame rate. In an ideal world, a graphics card and monitor would work in perfect harmony, with each frame displayed by the screen as soon as the graphics card has it ready. Unfortunately, that isn’t necessarily how things work. In fact, most of the time things don’t work that way at all. Before we can explain what G-Sync is, we need to understand the process of displaying motion graphics on a monitor, which is what most of this article will be about. Let’s dive in and take a closer look.

Most monitors sold today have a refresh rate of 60Hz, meaning that they update 60 times per second, or once every 16.67 milliseconds. Gaming monitors often have refresh rates between 120Hz (one refresh every 8.33 milliseconds) and 240Hz (one refresh every 4.17 milliseconds). A powerful graphics card might be able to surpass these figures, but the number of frames drawn per second doesn’t fully correlate with the perceived smoothness of the game.
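The refresh intervals quoted above are just the reciprocal of the refresh rate. As a quick sanity check (an illustrative snippet, not part of any NVIDIA tool):

```python
# Refresh interval in milliseconds for a given refresh rate in Hz.
def refresh_interval_ms(hz):
    return 1000.0 / hz

for hz in (60, 120, 240):
    print(hz, "Hz ->", round(refresh_interval_ms(hz), 2), "ms")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 240 Hz -> 4.17 ms
```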

If we take a hypothetical situation where a graphics card renders 50 frames per second and each is drawn 20 milliseconds apart, we will have a perceived 50 frames per second. If, however, the first 49 frames were drawn 1 millisecond apart, followed by a 950 millisecond delay before the 50th frame, we would have a perceived frame rate of just a single frame per second. For motion to be perceived as smooth, the frames have to be, for the most part, evenly spaced, as in the first example.
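The point can be sketched numerically: perceived smoothness is limited by the longest gap between consecutive frames, not by the average. A minimal illustration (the `perceived_fps` helper is hypothetical, for demonstration only):

```python
# Perceived frame rate is bounded by the worst (longest) frame gap,
# not by the average of all gaps.

def perceived_fps(frame_gaps_ms):
    """Worst-case perceived rate given per-frame gaps in milliseconds."""
    return 1000.0 / max(frame_gaps_ms)

even = [20.0] * 50                # 50 frames, evenly 20 ms apart
bursty = [1.0] * 49 + [950.0]     # 49 quick frames, then a long stall

print(perceived_fps(even))    # 50.0
print(perceived_fps(bursty))  # ~1.05
```

Both sequences deliver 50 frames in one second, but the second feels like roughly one frame per second because of the 950 millisecond stall.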

With v-sync (vertical sync) enabled, the graphics card will not send a frame to the monitor until it is ready to display the next image, limiting the frame rate to the refresh rate of the monitor. On a 60Hz screen, the graphics card would have to have a new frame ready every 16.67 milliseconds, which is quite easy to do with a static scene. In the heat of gaming, however, everything is dynamically changing. Explosions, splashes, physics effects and more mean that some frames are more complex than others, and therefore take longer to render.

With v-sync enabled the graphics card will wait for the monitor’s next refresh before sending the frame to the display.

As such, the frame timing might be more akin to 16 milliseconds for the first frame (displayed at the next refresh, 16.67 milliseconds in), 14 milliseconds for the second (again displayed at the next refresh), 18 for the third (too late for its refresh), 15 for the fourth, and so on. Because the third frame takes more than 16.67 milliseconds, the monitor doesn’t receive a new image in time for that refresh and repeats the previous frame instead, pushing the third frame back a full refresh and the fourth frame to the time at which the fifth should have been displayed. This results in stuttering, which is very perceptible to the human eye.
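The scheduling described above can be modelled in a few lines. This is a simplified sketch, assuming each frame begins rendering at a refresh tick (as in the example); real rendering pipelines overlap work in ways this ignores:

```python
import math

def display_refresh(render_times_ms, refresh_ms=1000.0 / 60.0):
    """Which refresh tick each frame appears on, v-sync style.

    A frame that finishes before the next refresh is shown at that
    refresh; one that misses it waits a full extra refresh, delaying
    every frame after it.
    """
    start = 0.0
    slots = []
    for t in render_times_ms:
        done = start + t
        slot = math.ceil(done / refresh_ms)   # next refresh boundary
        slots.append(slot)
        start = slot * refresh_ms             # next render starts at that tick
    return slots

# The 16, 14, 18 and 15 ms renders from the example above:
print(display_refresh([16, 14, 18, 15]))  # [1, 2, 4, 5]
```

The third frame lands on refresh 4 rather than 3 (refresh 3 repeats the old image), and the fourth is pushed to refresh 5, which is the stutter the article describes.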

When the monitor is performing a refresh but the graphics card doesn’t yet have the next frame available, the monitor will display the previous frame a second time and delay the following frame by one full refresh. This results in stuttering.

The alternative is to disable v-sync, in which case the graphics card will push each frame to the monitor as soon as it has been rendered. If the next frame is pushed before the monitor has finished its current refresh, the monitor will display the top portion of the first frame and the bottom of the second simultaneously, which results in screen tearing. This is seen as a horizontal tear line which becomes very prominent with fast-paced action.

With v-sync disabled more than one frame can be displayed at a time, causing screen tearing.

NVIDIA was the first company to properly address this issue, providing an in-house system board which replaces the monitor’s scaler. AMD has a similar technology called FreeSync, and the two work in a similar fashion, but as we’re looking at G-Sync we’re going to stick to its specifics.

The G-SYNC system board developed by NVIDIA is the heart of the technology, allowing for smooth gameplay.

The G-SYNC system board replaces the scaler. It adds approximately $100 to the price of a monitor.

G-Sync works by monitoring how long each frame takes to render and then dynamically varying, on the fly, the refresh rate of the monitor between 30Hz (one refresh every 33.33 milliseconds) and the maximum supported refresh rate of the monitor. This synchronization has to be supported by both the monitor, requiring the NVIDIA board, and the graphics card, which has to be a GeForce GTX 650 Ti or above.

If we take a moderately high refresh rate monitor, such as 144Hz: without any form of synchronization, frames arriving out of step with the refresh cycle cause screen tearing, whatever the frame rate. With traditional v-sync, anything below 144 frames per second will result in stuttering, while anything above will be capped at 144. With G-Sync, anything between 30 and 144 frames per second causes the screen’s refresh rate to vary dynamically in order to keep things as smooth as possible, and anything above 144 frames per second is capped. This results in the best of both worlds.
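The behaviour in that operating window can be sketched as a simple clamp. This is a simplified model of the description above (the `gsync_refresh_ms` helper is hypothetical, not an NVIDIA API):

```python
def gsync_refresh_ms(frame_time_ms, min_hz=30, max_hz=144):
    """Refresh interval a G-Sync panel would use for a given frame time.

    The panel follows the frame time directly, clamped between its
    fastest interval (1000/max_hz) and slowest interval (1000/min_hz).
    """
    fastest = 1000.0 / max_hz   # ~6.94 ms at 144Hz
    slowest = 1000.0 / min_hz   # ~33.33 ms at 30Hz
    return min(max(frame_time_ms, fastest), slowest)

for ft in (5.0, 10.0, 20.0, 40.0):
    print(ft, "ms frame ->", round(gsync_refresh_ms(ft), 2), "ms refresh")
```

Frames between 6.94 and 33.33 milliseconds are displayed exactly when they are ready; faster output is capped at 144Hz, and slower output falls back to the 30Hz floor.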

In this instance, G-SYNC is synchronizing the monitor’s refresh rate with the frame output of the graphics card for a smooth gaming experience.
