Nvidia’s G-Sync promises better, faster monitors that will revolutionize PC gaming

Nvidia's G-Sync tech, built into a monitor

Nvidia has demonstrated a new display refresh technology that’s meant to move v-sync (vertical synchronization) out of the stone age and create a solution better suited to the needs of modern displays. In doing so, it’s exposed a fundamental problem of technology — oftentimes, our standards aren’t based on any sort of objective evaluation of what’s “good,” but simply built on what worked at a given point in time.

Film standardized on 24 frames per second because it was fast enough for the eye to perceive as continuous motion, fast enough to keep the highly combustible film stock from igniting under the projection lamp, but slow enough to keep the cost of shooting a movie manageable. The 60 Hz refresh rate we're all familiar with was standardized because vacuum tube televisions needed to run at a multiple of the AC line frequency. When we moved to LCDs, we abandoned the electron gun, but LCDs still redraw the screen a fixed number of times per second. Nvidia wants to fix that with its new G-Sync display technology.

The issue, in a nutshell, is that graphics cards don't render at fixed speeds. While we've discussed this in our coverage of frame latency issues, those discussions focused entirely on the video card side of the equation: how long the GPU takes to draw and output each frame, and how variations in that timing can degrade the on-screen result. The entire reason we use v-sync, for example, is that it prevents tearing. With v-sync off, you can end up with output that looks like this:

Screen tearing

That’s with v-sync off, which means the video card pushes new images to the monitor as quickly as it can. The monitor, in turn, updates as quickly as it can, with no regard for whether the image being overwritten is fully synchronized from top to bottom. V-sync fixes this by capping the frame rate at the display's refresh rate. You can buy displays with refresh rates from 60 Hz up to 144 Hz (the 144 Hz panels are "true" 144 Hz and do not use interpolation the way some high-end televisions do). But a faster refresh rate doesn't fix the underlying problem: with v-sync enabled, any time the frame rate dips below the refresh rate, finished frames must wait for the next refresh tick, producing visible stutter. Nvidia previously attempted to address this on the GPU side with what it calls Adaptive V-Sync, which disables v-sync whenever the frame rate falls below the refresh rate, trading that stutter for tearing. G-Sync is something different. Instead of relying on GPU-side timing, G-Sync is a physical module that integrates directly into a monitor. It's compatible with all Kepler-based graphics cards, and should be compatible with all Nvidia GPUs going forward.
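To make the difference concrete, here's a minimal Python sketch (not Nvidia's code; the 60 Hz panel, frame times, and flip timing are illustrative assumptions). It models both failure modes: a mid-scanout flip with v-sync off, and the quantized display intervals that cause stutter with v-sync on:

```python
import math
import random

REFRESH_HZ = 60
TICK = 1000.0 / REFRESH_HZ  # one refresh interval, ~16.7 ms

random.seed(0)
# Hypothetical, variable GPU frame times in milliseconds.
render_ms = [random.uniform(12.0, 24.0) for _ in range(8)]

# V-sync off: the buffer flip happens the instant a frame finishes. If that
# instant falls mid-scanout, rows above the flip point show the old frame and
# rows below it show the new one -- the tear in the screenshot above.
flip_at_ms = 10.0  # hypothetical flip 10 ms into a 16.7 ms scanout
tear_row = int(1080 * flip_at_ms / TICK)
print(f"v-sync off: tear at roughly row {tear_row} of 1080")

# V-sync on (double-buffered): a finished frame waits for the next refresh
# boundary, so a frame that takes even 17 ms occupies two full ticks (33.3 ms).
prev_flip = 0.0
for ms in render_ms:
    flip = math.ceil((prev_flip + ms) / TICK) * TICK  # next tick after completion
    print(f"rendered in {ms:5.1f} ms -> on screen for {flip - prev_flip:5.1f} ms")
    prev_flip = flip
```

The printed intervals snap between 16.7 ms and 33.3 ms even though the simulated frame times vary smoothly, which is exactly the judder v-sync produces whenever the GPU can't keep pace with the refresh rate.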

According to NV, G-Sync synchronizes the monitor to the graphics card rather than the graphics card to the monitor, and promises such smooth gameplay that internal testers “have found themselves overshooting and missing targets because of the input and display lag that they have subconsciously accounted for during their many years of gaming.” By refreshing only once a frame is ready, the display allows for variable frame rates and smooth playback at the same time. Feedback from people who have seen the system in person has been enthusiastic.
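Based purely on that description, a companion sketch of the variable-refresh idea looks like this (again an assumption-laden illustration, not the G-Sync module's actual logic; the real hardware also enforces a panel-specific minimum refresh rate):

```python
import random

random.seed(0)
# The same hypothetical, variable GPU frame times in milliseconds.
render_ms = [random.uniform(12.0, 24.0) for _ in range(8)]

MAX_HZ = 144
MIN_INTERVAL = 1000.0 / MAX_HZ  # the panel can't scan out faster than its peak rate

# Variable refresh: the panel holds the current image until the GPU signals
# that the next frame is complete, then starts a new scanout immediately.
# No tick quantization (no stutter), no mid-scanout flips (no tearing).
for ms in render_ms:
    shown = max(ms, MIN_INTERVAL)  # clamped only by the panel's maximum rate
    print(f"rendered in {ms:5.1f} ms -> on screen for {shown:5.1f} ms")
```

Here every frame stays on screen for as long as it took to render, so pacing follows the GPU one-to-one instead of snapping to fixed ticks.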

Nvidia G-Sync

Nvidia’s G-Sync includes a 768MB buffer combined with a custom FPGA

Nvidia plans to make the technology available in two different ways. First, those of you with existing monitors (otherwise known as "everyone") will be able to upgrade an Asus VG248QE display with a standalone kit. There's no word on whether owners of other displays will get an upgrade option, or whether the kit will work with the 27-inch monitors in the same Asus family.

Otherwise, you’ll be able to buy a G-Sync monitor next year, at resolutions from 1920×1080 all the way up to 4K. Given the feedback from testers, this could be a major boon for the gaming industry. It also happens, coincidentally, to be an NV-only feature. If you think about it, this is damned smart of Nvidia. While a G-Sync module will presumably allow a monitor to work with any video card (just like a normal display), there's no reason to think an AMD GPU will be able to hook into the feature and use its specialized capabilities.

Since most gamers tend to upgrade video cards every two to three years but may use a display for considerably longer, this increases the odds of a person buying several Team Green video cards in a row. AMD will almost inevitably answer with a project of its own, possibly an open-source one. Whether gamers will want to pay a premium for G-Sync tech is a fair question, but I suspect a number will; after all, improved image quality is ostensibly why people invest in better monitors, and the boost here, according to all sources, is quite significant.

Now read: Triple monitor deathmatch: GTX Titan, GTX 680, and Radeon 7970 go head-to-head at 5760×1080

