Thanks to both of you for your input, it's helping me “rubber duck” the logic that was here before me and the stuff I've added on top.
So the rollbacks due to performance problems are of course annoying and I can't do too much about them. But I have noticed that my time-adjustment calculations are perhaps a bit off.
The predicted tick time on the client has to be in the future, aka ahead of the server. So to figure out that magic number I want to predict, I do the following calculation:
- Last Server Tick + (rtt in seconds * server tick rate) + maximum unack'd player inputs
Example: the server packet received is tick 4; rtt is 30ms, which at a 60hz tick rate is 1.8 ticks, rounded up to 2; max unack'd player inputs is 10. So 4 + 2 + 10 = tick 16
- if previous predicted tick < 13 (3 ticks less than preferred) = snap predicted time forward because the client has fallen behind
- if previous predicted tick > 22 (6 ticks more than preferred) = snap predicted time backwards because the client has drifted too far ahead
- if no snap has occurred and the number of unack'd commands is too low, we slow down predicted time for the next frame, and vice versa (speed up if there are too many). There's a rough code sketch of all this after the list.
Perhaps this is far too complicated? I'm noticing slower hardware ends up snapping time forward a lot because it's too far behind. But maybe my definition of “too far behind” is way too strict? In other words, using the example of targeting tick 16: if the client's last predicted tick was anywhere up to tick 12, it will snap time forward. Snapping time like that leads to rollbacks, but maybe I just need to redo the whole idea.
Thankfully, at the moment I don't have any situations where the client has predicted too fast… yet. I obviously want to support hardware faster than 60fps in the future, but for now I'm targeting the situations where the client is behind.
For the record, I believe I originally got this idea from this thread that I read sometime early last year:
https://www.gamedev.net/forums/topic/696756-command-frames-and-tick-synchronization/?page=3