ultrahax | 2 years ago
To my knowledge, the client timestamps its inputs and sends them to the server; the server will then rewind the state of the world to the time of the input before applying it. RTT isn't an input. Each snapshot from the server includes the server world timestamp of that snapshot; the client will gently lerp its clock toward this per frame.
Source - I've been a COD engine developer for the last ~15 years or so.
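The per-frame clock lerp described above can be sketched like this (a minimal illustration with made-up names and an assumed smoothing factor, not actual engine code):

```python
# Sketch: each server snapshot carries the server's world timestamp;
# the client nudges its own clock a small fraction toward it every
# frame instead of snapping, so time never visibly jumps.

def lerp_clock(client_time: float, snapshot_server_time: float,
               alpha: float = 0.05) -> float:
    """Move client_time a fraction `alpha` toward the server timestamp.

    A small alpha converges slowly but avoids perceptible time warps;
    the 0.05 here is an illustrative guess, not a known engine value.
    """
    return client_time + alpha * (snapshot_server_time - client_time)

# Example: client clock at 1000.0 ms, snapshots keep reporting 1040.0 ms.
t = 1000.0
for _ in range(60):  # one second of frames at 60 Hz
    t = lerp_clock(t, 1040.0)
# t has closed most of the 40 ms gap without ever jumping.
```

The design point is that the correction is spread across many frames: any single snapshot only moves the clock by a couple of milliseconds at most.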
asmor | 2 years ago
My info might be outdated, but I've noticed that on asynchronous routes, there seems to be a large bias that comes from assuming upstream latency == downstream latency. It might just be the clock not getting adjusted (even most NTP implementations make this assumption), but it's also been since ~T7 that I last checked. Conditioning the network to add ~40ms to downstream latency could actually reproduce this behavior.
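The bias being described falls straight out of the classic four-timestamp offset formula (RFC 5905 style), which splits the round trip evenly between the two directions. A small simulation (illustrative names, zero assumed server processing time) shows what ~40ms of extra downstream latency does to the estimate:

```python
# Sketch: the standard NTP offset estimate assumes symmetric one-way
# delays; on an asymmetric route it is biased by (uplink - downlink) / 2.

def ntp_offset(t0: float, t1: float, t2: float, t3: float) -> float:
    """Four-timestamp offset estimate (all times in ms).

    t0: client transmit, t1: server receive,
    t2: server transmit, t3: client receive.
    """
    return ((t1 - t0) + (t2 - t3)) / 2

def simulate(true_offset: float, up_ms: float, down_ms: float) -> float:
    """One exchange with given one-way delays; returns the estimate."""
    t0 = 0.0
    t1 = t0 + up_ms + true_offset    # server clock = client clock + offset
    t2 = t1                          # assume zero server processing time
    t3 = t2 - true_offset + down_ms
    return ntp_offset(t0, t1, t2, t3)

# Symmetric path: the estimate recovers the true 5 ms offset.
print(simulate(true_offset=5.0, up_ms=30.0, down_ms=30.0))   # 5.0
# Add 40 ms to the downstream only: the estimate is off by -20 ms.
print(simulate(true_offset=5.0, up_ms=30.0, down_ms=70.0))   # -15.0
```

So conditioning the network with +40ms downstream shifts every offset estimate by half that, 20ms, which is exactly the kind of systematic bias that would show up in-game.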
People don't really realize how hard of a problem sub-10ms clock sync can be on cursed networks.