Once your rendered framerate is up above ~90 Hz (or roughly 1.5 times the display refresh rate), even on a dumb 60 Hz display, screen tearing and waviness of the moving image are much less pronounced than what you see when the rendered framerate is closer to the native refresh rate. If my GPU can handle it, I even tend to limit the rendered frame rate to 121–122 frames per second (roughly double the refresh rate of my display) to make sure the display can "grab" the frame that is the most recent at any given refresh.

That way, every couple of frames the next displayed frame shows a more recent representation of what is going on in the virtual environment, as opposed to buffering and displaying (through vsync) a frame that is already 2–3 frame-times old in an unlucky moment. Also: if your graphics card can render 90 fps or even higher, it is not a bad idea to make use of it, as the display then gets the chance of presenting the virtual world-time more accurately.

Thing is, for real-time simulations I have been disabling vsync for a long time, as it can introduce 2–3 frames of lag. The Steam overlay indicator constantly jumps between 59 and 60 Hz unless my graphics card stumbles over the game and the rendered "frame rate" dips below the "refresh rate" of the display. Same with my old display when I enable vsync. It depends on the exact make and model of the display and what its internal electronics run at.
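For illustration only (this is not from the original post): a minimal sketch of what such an application-side frame-rate cap could look like with vsync disabled, assuming a hypothetical `render_frame()` function and a ~60 Hz panel with a ~121 fps target.

```cpp
// A minimal sketch of an application-side frame limiter: with vsync disabled,
// cap the render loop at roughly twice the display refresh (here ~121 fps for
// a ~60 Hz panel) so the display can always grab a recently rendered frame.
// render_frame() and the 121 fps target are illustrative assumptions only.
#include <chrono>
#include <thread>

void render_frame() {
    // Placeholder: a real application would simulate, draw and present one frame here.
}

int main() {
    using clock = std::chrono::steady_clock;
    const double target_fps = 121.0;                      // ~2x a ~60 Hz display
    const auto frame_budget =
        std::chrono::duration<double>(1.0 / target_fps);  // ~8.26 ms per frame

    auto next_deadline = clock::now() + frame_budget;
    while (true) {
        render_frame();  // freshest possible frame, no vsync buffering

        // Sleep until the next deadline so rendering never exceeds the cap.
        std::this_thread::sleep_until(next_deadline);
        next_deadline += frame_budget;
    }
}
```

For scale: at a 60 Hz refresh one frame-time is about 16.7 ms, so the 2–3 frames of vsync buffering mentioned above would correspond to roughly 33–50 ms of added latency.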