I have a simulation that ticks at 60 Hz, and I expect devices that may only render at 30 fps to play too.
I've struggled for a while to handle the performance of rollbacks for prediction with this setup. And since I've been fighting the engine and low-level code doing micro-optimizations, I figured maybe I need to reevaluate the higher-level problem: the rollbacks themselves.
Does anyone have a particular resource that describes or demonstrates their rollback and reprediction technique for reference?
The way mine is set up, every frame runs x prediction ticks at a fixed delta time: sample the player's input, predict, and send the input at the end of the frame. So every frame at 30 fps has 2 prediction ticks. I also do a "fractional" prediction tick: an additional tick that uses the leftover frame time to produce a smoothed-out prediction. This fractional prediction tick always gets rolled back every frame. The others do not.
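For reference, here's a minimal sketch of that frame loop as I understand it: a fixed-timestep accumulator plus one fractional tick at the end. The `Sim` class and all names here are hypothetical stand-ins, not anyone's actual engine code; the caller is assumed to `restore()` at the start of the next frame to undo the fractional tick.

```python
TICK_DT = 1.0 / 60.0  # fixed 60 Hz simulation step

class Sim:
    """Toy stand-in for the game world: position integrates input velocity."""
    def __init__(self, pos=0.0):
        self.pos = pos
        self.saved = None
    def tick(self, vel, dt):
        self.pos += vel * dt
    def save(self):
        self.saved = self.pos   # snapshot taken before the fractional tick
    def restore(self):
        self.pos = self.saved   # roll the fractional tick back next frame

def advance_frame(sim, frame_dt, accumulator, input_vel):
    """Run as many fixed-dt prediction ticks as fit in the frame, then one
    fractional tick with the leftover time to smooth rendering."""
    accumulator += frame_dt
    while accumulator >= TICK_DT:
        sim.tick(input_vel, TICK_DT)    # real prediction tick (kept)
        accumulator -= TICK_DT
    sim.save()                          # fractional tick is always undone
    sim.tick(input_vel, accumulator)    # fractional smoothing tick
    return accumulator
```

At 30 fps (`frame_dt ≈ 33 ms`) the while loop runs twice, matching the "2 prediction ticks per frame" case above.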
The others get rolled back when the game mispredicts OR the sim falls behind (dropped packets or a device freeze/stall). If the hash of the predicted world does not match the hash of the server world, the client must roll back to the server world state, then resimulate all pending input from the server state up to the last prediction tick. On a 30 fps device running a 60 Hz sim with 40 ms RTT (assume a clean 20 ms one way), I have something like 20-40 inputs not yet confirmed, so I have to resimulate that many ticks in a single frame.
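The hash-check-then-resimulate step might look roughly like this. Again a sketch with made-up names (`reconcile`, a toy `Sim`, a rounded position as the "world hash"), assuming a deterministic sim and per-tick inputs buffered in a dict:

```python
TICK_DT = 1.0 / 60.0

class Sim:
    """Toy simulation: state is one position driven by input velocity."""
    def __init__(self, pos=0.0):
        self.pos = pos
    def tick(self, vel):
        self.pos += vel * TICK_DT

def reconcile(sim, server_pos, server_tick, predicted_hash_at_server_tick,
              pending_inputs, last_predicted_tick):
    """Compare the client's stored hash for the server's tick against the
    authoritative state; on mismatch, restore the server state and replay
    every unconfirmed input. Returns the number of resimulated ticks."""
    server_hash = round(server_pos, 6)   # stand-in for a real state hash
    if predicted_hash_at_server_tick == server_hash:
        return 0                         # prediction confirmed, no rollback
    sim.pos = server_pos                 # rollback to authoritative state
    resims = 0
    for tick in range(server_tick + 1, last_predicted_tick + 1):
        sim.tick(pending_inputs[tick])   # resimulate one pending input
        resims += 1
    return resims
```

With 20-40 pending inputs, that `for` loop is exactly the burst of full sim ticks that has to fit inside one frame, which is where the mobile cost comes from.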
This is a killer for mobile devices in particular; rollbacks destroy their performance. And often it's not even a misprediction, it's just lag: the client dropped some packets or a frame stalled, and now it has to roll back because it's behind the sim.
Trying to think of ways to soften this! Maybe there's some method of checking input values, seeing if they haven't changed, and merging/skipping certain parts of the resimulation in a rollback?
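One possible shape of that idea, sketched under a big assumption: if the sim is fully deterministic and the only inputs affecting the predicted state are the ones being compared, then any prefix of ticks where the server-confirmed inputs equal the inputs you predicted with must have produced identical states, so resimulation can start at the first divergent tick instead of at the server tick. The function name and dict layout here are hypothetical:

```python
def first_divergent_tick(predicted_inputs, confirmed_inputs,
                         first_tick, last_confirmed_tick):
    """Return the first tick whose server-confirmed input differs from the
    input used during prediction; ticks before it were simulated with the
    correct inputs (given a deterministic sim) and can be skipped in the
    resim. Returns last_confirmed_tick + 1 if every input matched."""
    for tick in range(first_tick, last_confirmed_tick + 1):
        if predicted_inputs.get(tick) != confirmed_inputs.get(tick):
            return tick                  # resimulate from here
    return last_confirmed_tick + 1       # nothing to resimulate
```

The catch is that this only helps for divergence caused by your own inputs; if other players' inputs or server-side events differ from what you predicted, the states diverge even when your inputs match, so the hash check is still needed as the ground truth.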