lispit | 10 years ago
It can be, if you're either on a fixed hardware platform or have both modest demands and a framelimiter in place to keep things from going off the rails. But in return you pay by introducing a major source of nondeterminism that can cause maddening bugs, break replays, and hinder synchronization in networked games.
>so long as people keep vsync on (why do you need framerate greater than your monitor's refresh rate anyway...)
Two things are going on here. Vsync is theoretically a great idea (and it worked perfectly in practice on countless simpler platforms in the past), but due to driver flaws and the realities of preemptive multitasking, it introduces a noticeable extra frame or two of latency on every PC I've used in the past decade or so, whether I'm playing a game or just using regular desktop applications. My guess is that the OS scheduler isn't precise enough to keep applications from barely missing a present deadline (causing unnecessary stuttering), so driver devs force triple buffering whenever vsync is enabled to compensate, giving you a smooth but unresponsive presentation. It really sucks that I have to toggle desktop composition (and thus vsync) on and off to fix stuttering in one application or tearing in another, but somewhere in the Lovecraftian horror that is Windows, someone screwed up.
The other is "why do you need a framerate greater than your monitor's refresh rate anyway," and the answer is "to provide the lowest latency and smoothest presentation possible within the constraints of a preemptive multitasking OS." In a perfect world, a game would know exactly how long it would take to simulate and render a frame, and would wait as long as possible before doing so, so that the frame the player sees is built from the most up-to-date input from the keyboard, mouse, and network, with the least possible latency baked into it. This is not a perfect world, but you can get a similar effect (at greater CPU and GPU cost) by rendering multiple superfluous frames, so that whichever one happens to be presented is much closer to that ideal than the one you'd get by rendering at the start of the 16 ms interval and then yielding for the rest of it.
This is part of why you often see "pro e-sports" types turning the graphics settings down to comical levels, by the way. Not only to lessen the threat of a completely missed frame due to a spike in visual complexity, but also so that they can run their game with the framelimiter off.