
NVIDIA G-SYNC Technology: An In-Depth Look

By Tyler Nulty

NVIDIA unveiled a notable new piece of hardware technology, called G-SYNC, at a press event in Montreal on Friday, October 18, 2013. G-SYNC looks like the answer to a major complaint from the PC gaming community about image quality in fast-moving video games: tearing, stutter, and lag, all of which NVIDIA claims the technology eliminates. NVIDIA's CEO Jen-Hsun Huang put on a compelling performance at the event, announcing several exciting new technologies: GameStream, the GeForce GTX 780 Ti, ShadowPlay, and of course, G-SYNC. In this article I will give you my explanation, thoughts, and opinions on this new tech called G-SYNC.

[Image: NVIDIA CEO Jen-Hsun Huang presenting G-SYNC]

I am sure most of you out there, like me, have been playing a game or using a graphics-intensive program when all of a sudden the image on your screen tears, lags, stutters, or just generally glitches out. The problem comes from a mismatch in how two pieces of hardware keep time with each other: the graphics unit and the monitor. G-SYNC aims to solve this problem once and for all.

NVIDIA G-SYNC Explained

If you have made it this far into the article, I will attempt to capture the essence of G-SYNC and how it works. First we need a little background on the hardware interactions that cause stutter, lag, tearing, and glitching.

The most basic explanation of the hardware associated with G-SYNC is the pairing of the graphics unit and the monitor. The graphics unit renders frames and sends them to the monitor as a signal, and the monitor displays that signal in graphical form. These signals are basically little bursts of electricity sent through a metal medium (some are transferred using light over optical mediums, but that is a whole other article to be written in the future). Both the graphics unit and the monitor contain dedicated modules that send and receive this information, but they keep time differently. A traditional monitor refreshes itself on a fixed schedule, typically 60 times per second, no matter what. The graphics unit, on the other hand, finishes each frame whenever it finishes: a simple scene renders quickly, while a complex one takes longer. The problem we have been living with is that these two clocks are not synchronized, and when a new frame arrives at the wrong moment relative to the monitor's refresh, we see the errors: stutter, lag, tearing, and glitching.
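
To make that mismatch concrete, here is a minimal Python sketch of the timing problem. Every number in it is made up for illustration (a 60 Hz monitor, a GPU with jittery render times); it is not NVIDIA's code, just the arithmetic of two clocks that disagree.

    # A minimal sketch of the timing mismatch described above. All values
    # are hypothetical: the 60 Hz refresh, the jittery render times, and
    # the "tear" test are my own illustration, not NVIDIA code.
    import random

    REFRESH_HZ = 60
    SCANOUT = 1.0 / REFRESH_HZ   # time the monitor spends on one full refresh

    random.seed(1)
    frame_ready = []             # timestamps when the GPU finishes each frame
    t = 0.0
    while t < 0.5:               # simulate half a second of rendering
        t += random.uniform(1 / 90, 1 / 55)   # variable render time per frame
        frame_ready.append(t)

    tears = 0
    for ready in frame_ready:
        offset = ready % SCANOUT          # where in the scanout the swap lands
        # A swap that lands mid-scanout leaves the top of the screen showing
        # the old frame and the bottom showing the new one: a tear.
        if 0.05 * SCANOUT < offset < 0.95 * SCANOUT:
            tears += 1

    print(f"{len(frame_ready)} frames rendered, "
          f"{tears} swapped mid-refresh (potential tears)")

Run it and nearly every frame lands somewhere mid-refresh, which is exactly why uncapped rendering on a fixed-refresh monitor tears so readily.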

[Image: The NVIDIA G-SYNC module circuit board]

When the frame rate and the refresh rate fall out of step, two things can go wrong. If the graphics unit swaps in a new frame while the monitor is midway through drawing the screen, the display ends up showing pieces of two different frames at once; that is tearing. If the graphics unit has not finished a new frame by the time the monitor's refresh comes due, the monitor simply redraws the previous frame, and motion hitches; that is stuttering. Below is an example of stuttering on a laptop from 2007. The colored arrows point out how the dotted selection line has repeated its presence on the screen multiple times in a given span of time, when its movement should be smooth. Clearly the graphics unit and the monitor on this laptop were not in sync.

[Image: Stutter artifact on a Photoshop dotted selection line]
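
The repeated selection line above is the stutter half of the story, and it can be sketched the same way. Again, the numbers below are hypothetical and of my own choosing; the point is only that a GPU averaging under 60 FPS forces a fixed 60 Hz monitor to redraw stale frames.

    # A companion sketch (same caveats: made-up numbers, my own illustration)
    # of the stutter case: at each fixed 60 Hz refresh, if the GPU has not
    # finished a new frame since the last tick, the monitor redraws the old one.
    import random

    random.seed(4)
    SCANOUT = 1 / 60
    frame_ready = []
    t = 0.0
    while t < 1.0:                             # one second of rendering
        t += random.uniform(1 / 75, 1 / 45)    # GPU often slower than 60 FPS
        frame_ready.append(t)

    repeats = 0
    latest_shown = -1
    for tick in range(1, 61):                  # sixty refreshes in that second
        refresh_time = tick * SCANOUT
        # Newest frame the GPU has finished before this refresh fires.
        ready_count = sum(1 for f in frame_ready if f <= refresh_time)
        if ready_count - 1 == latest_shown:
            repeats += 1                       # nothing new: old frame redrawn
        latest_shown = ready_count - 1

    print(f"{repeats} of 60 refreshes repeated an old frame (visible stutter)")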

Synchronizing the graphics unit and the monitor is exactly what G-SYNC does, and it does it by letting the monitor's refresh follow the graphics unit instead of a fixed clock. Imagine a traffic light on a rigid timer: it turns green at the same interval whether or not a car is waiting, so cars either idle at the line or get caught halfway through the intersection when it changes. That is a conventional fixed-refresh monitor, as everyone who has sat at an empty intersection at 2 a.m. knows. Now imagine a light that turns green the instant a car pulls up. That is NVIDIA G-SYNC in action: the monitor refreshes the moment a new frame is ready.
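
For the skeptical, the difference between the two traffic lights can be put in numbers. The Python sketch below is an idealization built on made-up frame times (it also credits G-SYNC with zero added delay, which is the best case, not a measurement), but it shows why refreshing on arrival beats refreshing on a timer.

    # A toy comparison (my own illustration, not NVIDIA's implementation) of
    # the two traffic-light policies. With a fixed 60 Hz refresh, a finished
    # frame waits for the next tick; with G-SYNC-style variable refresh, the
    # monitor refreshes the moment the frame is ready.
    import random

    random.seed(2)
    SCANOUT = 1 / 60
    frames = []
    t = 0.0
    for _ in range(120):
        t += random.uniform(1 / 90, 1 / 50)   # hypothetical variable frame times
        frames.append(t)

    def wait_for_next_tick(ready):
        """Delay until the next fixed 60 Hz tick (classic V-Sync behaviour)."""
        next_tick = (int(ready / SCANOUT) + 1) * SCANOUT
        return next_tick - ready

    vsync_wait = sum(wait_for_next_tick(f) for f in frames) / len(frames)
    gsync_wait = 0.0    # idealized: displayed the instant it is ready

    print(f"average added delay, fixed refresh:    {vsync_wait * 1000:.1f} ms")
    print(f"average added delay, refresh-on-ready: {gsync_wait * 1000:.1f} ms")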

NVIDIA G-SYNC Final Thoughts

Now, while I might not get stuttering and lag all the time while using my computer, it does happen, and when it does I can get pretty upset and confused. It's just one of those things I might not realize was so bad until my whole day was thrown off by a small glitch. I think the gaming community can sympathize with me on that note, and I imagine a whole bunch of gamers are now rejoicing after hearing the news of G-SYNC and how it works. G-SYNC will not only help gamers level up and take down the big boss; it will have very practical uses in other areas of computing. In CAD (computer-aided drafting), for example, G-SYNC could have tremendous benefits when rendering, creating, and viewing complex three-dimensional objects in software.

[Image: A monitor equipped with NVIDIA G-SYNC]

With the introduction of G-SYNC, more doors are being opened for programmers to take advantage of the way G-SYNC coordinates the graphics unit and the monitor. Having the monitor's refresh tied directly to the graphics unit can aid in rendering very complex 3D worlds without visual errors. NVIDIA G-SYNC technology enables the graphics unit and monitor to synchronize the refresh rate with the frames actually being produced, while still keeping the variable frame rates most games run at. Unlike a conventional setup, where the graphics unit and monitor refresh on independent schedules, G-SYNC is a module in the monitor that slaves the display to the graphics unit, making sure frames reach the screen in order and at the proper moment.
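
As a last sketch, here is what that master/slave relationship looks like as a loop. The function names are hypothetical stand-ins, not a real driver API; the point is simply who waits for whom.

    # A schematic render loop with hypothetical function names (this is not
    # a real driver API) showing the inversion G-SYNC introduces: the monitor
    # holds its refresh until the graphics unit delivers the next frame,
    # instead of the graphics unit waiting on the monitor's fixed tick.
    import random
    import time

    random.seed(3)

    def render_frame():
        """Stand-in for real rendering work with a variable cost."""
        time.sleep(random.uniform(1 / 90, 1 / 50))

    def monitor_refresh(frame_id):
        """Stand-in for the G-SYNC module scanning out one frame on demand."""
        print(f"monitor scans out frame {frame_id}")

    for frame_id in range(5):
        render_frame()             # the GPU finishes whenever it finishes
        monitor_refresh(frame_id)  # the display refreshes now, in step with it

In the fixed-refresh world the roles are reversed: the swap call blocks until the monitor's next tick comes around.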

NVIDIA G-SYNC Opinions

My opinions on this new technology will be brief. I have yet to test G-SYNC hands-on, but I will give my best assessment with the knowledge I do have. It is a very cool technology that will shape the future of gaming tech in the monitor arena. It is expected to appear in the new 4K monitors, which I cannot wait for! G-SYNC paired with a 4K monitor should produce incredible results, and needless to say, the motion clarity and detail should be impressive, 4K monitor or not. I do hope the programmers will be there for this new technology, for without them we won't be able to see what this new tech can really do. I think NVIDIA teaming up with four major monitor manufacturers is beneficial for all of us. With ASUS, BenQ, Philips, and ViewSonic all planning to use G-SYNC, we can say for sure it will be available to gamers and PC users alike in the near future.


1 comment

  1. Game2Live

    As someone who is serious about gaming, on the surface this sounds amazing, but I have to say I am skeptical. NVIDIA told the world how vsync was broken and they would fix it with their adaptive vsync; we all know what a boondoggle that was. Now they are claiming they have fixed it again, but this time we need to spend even more money for it.

    I can tell you that in my gaming I have only once within the last few years seen any kind of tearing that had ANY impact on my gameplay, and the funny thing is, it was while using an NVIDIA card and their adaptive vsync. During gameplay without adaptive vsync, or on an AMD card, I have never had my play affected by the issues you described, and I game a LOT.

    I also do a lot of photo editing, and while I can imagine it during some kind of super-rapid scroll through a large photo, never during actual editing have I seen any kind of tearing like you described or posted in your picture.

    This sounds like a neat idea, but since it fixes an issue that is not an issue for 99% of users, is it really worth it? Is fixing an issue that is not an issue really a fix?
