Graphics refresh rates and VSYNC

The questioner asked why you'd want to lock your frame rate to VSYNC - and what were the benefits of running a game at higher frame rates than the monitor could draw?

Higher Frame Rates than the Monitor can Draw

Running the game at a frame rate higher than the monitor can refresh has advantages in some cases. Most games do everything once per frame: they check the mouse once, check the keyboard once, update the player's movement and each of the AIs once, then draw the graphics once. If the game is running at (say) 60 frames per second, it reads your keyboard and decides what to do about it 60 times per second - that's once every 16.7 milliseconds.

If you are lucky, you press the 'fire' button the moment you see the enemy appear and, at that exact moment, the computer reads your keyboard and tells the server "HE FIRED!!!". If you are unlucky, the computer reads the keyboard, you press the key just a fraction too late, and only 16.7 milliseconds later does the computer read the key and tell the server that you fired. On average, about 8.3 milliseconds pass between you hitting the key and the computer noticing. In that amount of time, the other player may have moved some more - so maybe you missed.

If you crank your frame rate up to (say) 100Hz (assuming the computer can handle doing everything 100 times a second instead of 60), then the computer reads the keyboard every 10 milliseconds - so the average delay between hitting the key and the computer reading it is only 5 milliseconds. The other player will have moved a smaller distance over 5 milliseconds than over 8.3 milliseconds - so, on average, you shoot more accurately than you did at 60Hz. HOWEVER, your graphics don't look so good... maybe that's a bad thing.
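The arithmetic above is easy to check. Here's a tiny sketch (the function name is invented for illustration) of the "average input delay is half the frame time" reasoning: a key press lands at a random point within a frame interval, so on average it waits half an interval before the next keyboard read.

```python
def average_input_delay_ms(frame_rate_hz):
    """Average time (ms) between a key press and the next once-per-frame
    keyboard read, assuming the press lands uniformly within a frame."""
    frame_time_ms = 1000.0 / frame_rate_hz
    # A press can arrive anywhere in the interval, so the mean wait
    # until the next read is half the frame time.
    return frame_time_ms / 2.0

# 60Hz: ~8.3ms average delay; 100Hz: 5ms average delay.
print(average_input_delay_ms(60))   # ~8.33
print(average_input_delay_ms(100))  # 5.0
```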

What is VSYNC?

The business of VSYNC is a bit more complicated. Inside the graphics card there are two copies of what you see on-screen: the "Front Buffer", which the graphics card reads from in order to form the picture on your monitor or TV, and the "Back Buffer", which is where the computer is drawing the next image. What's in the back buffer at any given moment is an incomplete scene - maybe the buildings and the sky have been drawn, but not the evil aliens or the laser zaps or whatever. When the computer finishes drawing a complete scene, it swaps the front and back buffers (or perhaps copies the back buffer into the front - it depends on the hardware). What was the back buffer becomes the freshly drawn on-screen picture, and the other buffer can be erased and a new picture started ready for the next frame. This is called 'swapping the buffers', or 'buffer-swap' for short.

The 'VSYNC' thing relates to when, precisely, the computer does that buffer swap. Remember that (at 60Hz) your monitor draws the picture line by line down the screen as it scans out the raster (we call this 'painting the screen'). If the computer were to swap buffers while the raster was painting halfway down the screen, the top half of the picture would show where everything in the virtual world was one frame ago, and the bottom half would show it as it is now. If something is moving fast across the screen, or (especially) if the camera is moving quickly, the screen looks like it has a tear across the middle because the top and bottom parts no longer line up properly. The 'proper' way to fix that is to have the computer always do the buffer-swap when the monitor has just finished painting the bottom of the screen and is zipping back up to the top to start again. This period is called "the vertical retrace interval", and there is a signal called 'VSYNC' that indicates when it is happening.
So - if the buffer-swap is made to happen when VSYNC happens then there is no tearing on the screen because each new raster is repainted with one entire picture.

However, the problem with this is that the computer has to wait for the VSYNC signal before it can buffer-swap - and that's a waste of time. If VSYNC happens at 60Hz (every 16.7 milliseconds) and the computer is ready to swap after 10 milliseconds, it has to sit there twiddling its thumbs waiting for the slow old monitor to finish painting the raster. Well, if your computer can generate frames faster than 60Hz, maybe you don't care that it's waiting for VSYNC and therefore can never go faster than 60Hz.

But what happens if your software takes MORE than 16.7 milliseconds? Suppose it takes 18 milliseconds. When the VSYNC signal pops up, the graphics card hasn't finished drawing the picture yet, so the buffers can't be swapped. Instead it has to wait until the following VSYNC - which is 33.3 milliseconds after we started drawing. So the computer finishes drawing after 18 milliseconds, then waits around for another 15.3 milliseconds before it can swap the buffers. Then off we go again with the next frame, and the same thing happens. So the frame rate (which at 18 milliseconds per frame would have been a reasonable 55Hz) has now dropped to half the rate of the monitor's VSYNC - which is 30Hz! In fact, if your software is running anywhere below 60Hz, it'll drop all the way down to 30Hz. If it's slower than 30Hz, it'll drop down to the next sub-multiple of 60Hz - 20Hz (three VSYNC intervals), then 15Hz (four VSYNC intervals), and so on.

So here you have a trade-off. If you lock the software to swap the buffers at VSYNC then you risk making the frame rate slower than it could have been - but if you don't lock it, you get the "tearing" artifact. It's largely a matter of personal preference.