G-Sync vs. FreeSync: Which is Better?

Samsung’s CFG70 monitor with Quantum Dot technology

For a new PC builder, or anyone who has been out of the hardware loop for a few years, choosing a new monitor can be fairly difficult. These days, dynamic refresh rate technology, which syncs your monitor’s refresh rate to your graphics card’s output, plays a big role in that choice.

The two big contenders in dynamic refresh rate technology are AMD’s FreeSync and NVIDIA’s G-Sync. Both have their strengths and weaknesses, which we’ll try to address in detail in this article.

Today, most PC users own a 1080p screen, but enthusiasts are increasingly opting for higher resolution monitors. PC gamers are also increasingly interested in monitors with higher refresh rates, i.e. 100+ Hz monitors. Higher refresh rates increase the responsiveness and smoothness of the game, which is most useful for competitive FPS, MOBA and fighting games. Unfortunately, having both a high resolution and a high refresh rate comes at a hefty price.
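To put those refresh rates in perspective, here is a quick back-of-the-envelope calculation (a minimal Python sketch, not tied to any particular hardware) showing how long each frame stays on screen at common refresh rates:

```python
# Each refresh displays one frame; the time per refresh is
# 1000 ms divided by the refresh rate. Higher Hz -> shorter
# frame times -> smoother, more responsive motion.
for hz in (60, 100, 144, 240):
    frame_time_ms = 1000 / hz
    print(f"{hz:>3} Hz -> {frame_time_ms:.1f} ms per frame")
```

Going from 60 Hz to 144 Hz cuts the time per frame from roughly 16.7 ms to 6.9 ms, which is where the extra smoothness in fast-paced games comes from.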

Dynamic refresh rate technology is another important factor that is often overlooked compared to resolution and refresh rates. AMD graphics cards can be combined with FreeSync screens, while NVIDIA graphics cards can take advantage of G-Sync screens. Both of these technologies strive for the same goal: synchronizing the rate at which your graphics card outputs frames with how fast the monitor displays them.

These technologies allow for gameplay free of stutter and screen tearing while avoiding the input lag that comes with a game’s vertical sync setting. However, the two achieve this in very different ways. NVIDIA uses a proprietary hardware module installed in the monitor itself, while AMD relies on Adaptive-Sync, a VESA standard that is part of the DisplayPort 1.2a specification. This means that any display implementing DP 1.2a’s optional Adaptive-Sync feature can support FreeSync when paired with the right graphics card.
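To make the tearing problem concrete, here is a toy Python model (my own illustration, not how any driver actually works): the GPU finishes frames at one fixed rate, the monitor scans out at another, and any refresh interrupted by a newly finished frame shows parts of two frames at once, i.e. a tear:

```python
def tear_count(fps, hz, duration_s=1.0):
    """Count refresh cycles in which a new GPU frame lands mid-scanout.
    A frame arriving strictly inside a scanout splits the image in two."""
    frame_times = [i / fps for i in range(int(duration_s * fps))]
    tears = 0
    for r in range(int(duration_s * hz)):
        start, end = r / hz, (r + 1) / hz
        if any(start < t < end for t in frame_times):
            tears += 1
    return tears

# Perfectly in sync, frames land exactly on refresh boundaries: no tearing.
print(tear_count(fps=60, hz=60))  # 0
# Out of sync, most refreshes catch a frame mid-scanout.
print(tear_count(fps=90, hz=60))
```

Dynamic refresh rate technology sidesteps the problem entirely by moving the refresh boundaries to wherever the frames actually finish, rather than making the GPU wait (vertical sync) or letting frames land mid-scanout (tearing).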

Screen tearing in Portal

Apart from vendor specificity, the main difference between FreeSync and G-Sync is the price. You can get a FreeSync monitor for as low as $120, whereas G-Sync monitors start at well over $200.

Another thing to consider is the variety of available monitors. There are more than 100 different models of monitors with FreeSync support, whereas G-Sync monitor models are far less abundant.

There is definitely more quality control behind G-Sync monitors, meaning that when you buy one, you will not be disappointed even if you don’t know what you’re getting into. But for price-conscious PC builders, that assurance is often not worth the premium.

Why screen tearing occurs

PCWorld has a great article explaining in depth why FreeSync is beating G-Sync on price and abundance. In short, G-Sync monitors are expensive not because of the module that goes inside (according to NVIDIA’s Tom Petersen, the cost of the module is “relatively minor”), but because of all the quality control and R&D (research and development) required of the monitor manufacturers.

Quality control and R&D raise the price of a monitor, which reduces sales and lengthens the time it takes for the investment to pay off. FreeSync, on the other hand, requires nothing beyond support for the DP 1.2a standard, which lets vendors build both cheap models and expensive, high-quality ones.

The Acer Predator X34 with G-Sync costs over $1000

When FreeSync first launched, it was criticized (and rightfully so) for being active only within a specific framerate range (for instance 45-60 Hz): if your in-game framerate fell outside that range, the technology simply stopped working. NVIDIA’s G-Sync never had this problem and remains active even at single-digit framerates.

A few months after FreeSync’s release, the problem was addressed with LFC (low framerate compensation). If the upper bound of a monitor’s FreeSync range is at least 2.5 times the lower bound, the refresh rate can stay dynamic even at framerates below 30 FPS, with the monitor displaying each frame multiple times. This eliminated the main disadvantage of FreeSync versus G-Sync. It is important to note that not all monitors support LFC, but that will hopefully change with FreeSync 2.
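The LFC rule can be sketched in a few lines of Python (a simplified model of my own; real drivers are more sophisticated about choosing frame multiples):

```python
def lfc_supported(min_hz, max_hz):
    """AMD's requirement: the FreeSync range's upper bound must be
    at least 2.5x its lower bound for LFC to work."""
    return max_hz >= 2.5 * min_hz

def effective_refresh(fps, min_hz, max_hz):
    """If the game's framerate drops below the FreeSync range, show
    each frame 2, 3, ... times so the panel stays inside its range."""
    if fps >= min_hz:
        return min(fps, max_hz)  # in range (or capped at the maximum)
    if not lfc_supported(min_hz, max_hz):
        return None              # no LFC: panel falls back to fixed refresh
    multiplier = 1
    while fps * multiplier < min_hz:
        multiplier += 1
    return fps * multiplier

# A 48-144 Hz panel at 25 FPS: each frame is shown twice -> 50 Hz refresh.
print(effective_refresh(25, 48, 144))  # 50
```

On a hypothetical 45-60 Hz panel, by contrast, `lfc_supported` returns `False` (60 is less than 2.5 × 45), so framerates below 45 FPS fall outside the dynamic range, which is exactly the launch-day limitation described above.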

All of this does not make G-Sync a bad choice. NVIDIA was the first to let people experience dynamic refresh rates in video games. It does suggest it is time to move on, though, as better, more consumer- and vendor-friendly technologies reach the market. Since Adaptive-Sync is an open VESA standard, NVIDIA could support it (presumably after rebranding it as an NVIDIA technology) on all of its cards with a firmware update, but unfortunately, that is not likely to happen anytime soon.