
Problem with processing inputs on the server

Started by August 14, 2024 05:31 PM
5 comments, last by hplus0603 3 weeks, 4 days ago

After reading these two articles:

https://www.gabrielgambetta.com/client-server-game-architecture.html

https://developer.valvesoftware.com/wiki/Source_Multiplayer_Networking

I have implemented a working system for client-side prediction, and am now working on entity interpolation.

A problem I encountered is that other players' movement sometimes appears jumpy, even when they are moving at a constant speed.

I understand why this happens: the server doesn't always simulate the same number of inputs between broadcasts of the players' positions, which in turn happens because the client does not always send the same number of inputs in each packet.

This is mentioned in Valve's article: “[…]This means two or more user commands are transmitted within the same packet.[…]”. They only mention that a packet can contain two or more commands, though; in my case the count also varies from packet to packet: if my client sends 3 inputs in one packet, it sends only 1 in the next.

Anyway: if the server consumes all the inputs it has received on every simulation step (which runs at the same rate as the client samples inputs), players will move faster on some frames than on others, even though they appear to move at a constant speed locally on the client that predicts its own movement.
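
Here is a minimal sketch of what I mean (hypothetical names, not my actual code): the server drains whatever inputs have arrived, so the distance moved per broadcast varies with the packet contents.

```cpp
#include <queue>

struct Input { float moveX, moveY; };

struct Player {
    float x = 0, y = 0;
    std::queue<Input> pendingInputs; // filled by the network layer
};

// Naive approach: drain the whole queue every tick. A player advances
// 3 steps on one server tick and 1 step on another, even though the
// client sampled inputs at a fixed rate.
void serverTick(Player& p, float speed, float dt) {
    while (!p.pendingInputs.empty()) {
        const Input in = p.pendingInputs.front();
        p.pendingInputs.pop();
        p.x += in.moveX * speed * dt;
        p.y += in.moveY * speed * dt;
    }
}
```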

How can I properly approach this problem of processing inputs on the server?


There are several approaches. Lockstep, with all of its pros and cons. Rollback and resimulation is sometimes used, with various pros and cons. Accepting one simulation as the truth and having all the others interpolate (or pop) into position is another, with pros and cons. Declaring one client a cheater and ending the game is an option, with various pros and cons. And you can come up with other options, each with their associated effort and pros and cons.

Details will depend tremendously on your game. How much validation is needed or expected? There are plenty of games where play is non-competitive, so servers are very lax; there are also games where play is highly competitive and an absolute source of truth must be followed; and there is everything in between.


I think I solved it. Instead of having the server process multiple inputs every tick, it should process only 1 input per tick. That way, the variation in the number of inputs per packet does not seem to matter.

I thought I had tested this before and seen the same problem, but apparently not.
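
A minimal sketch of this approach (hypothetical names): dequeue at most one input per tick, and let the queue absorb arrival bursts instead of turning them into speed changes.

```cpp
#include <queue>

struct Input { float moveX, moveY; };

struct Player {
    float x = 0, y = 0;
    std::queue<Input> pendingInputs;
    Input lastInput{}; // reused if the queue runs dry for a tick
};

// One input per tick: bursts stay buffered, so movement speed is even.
void serverTick(Player& p, float speed, float dt) {
    Input in = p.lastInput;
    if (!p.pendingInputs.empty()) {
        in = p.pendingInputs.front();
        p.pendingInputs.pop();
        p.lastInput = in;
    }
    p.x += in.moveX * speed * dt;
    p.y += in.moveY * speed * dt;
}
```

Note that the queue adds latency proportional to its depth, so it is worth clamping how many inputs are allowed to pile up.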


Normally it's best to process all the data that's been fully received to that point, and buffer anything that is still streaming over the wire.

For example, if you've received 2.3 messages' worth of data in a stream, process the two complete ones and leave the remaining 0.3 for the next pass through.
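
A minimal sketch of that buffering, assuming a simple 2-byte length-prefixed framing (the framing scheme and names are assumptions for illustration):

```cpp
#include <cstdint>
#include <vector>

// Appends newly received bytes, then extracts every complete message.
// Whatever is left over (the "0.3 of a message") stays in `buffer`
// until the next call.
std::vector<std::vector<uint8_t>> extractMessages(
        std::vector<uint8_t>& buffer,
        const uint8_t* data, size_t len) {
    buffer.insert(buffer.end(), data, data + len);

    std::vector<std::vector<uint8_t>> complete;
    size_t offset = 0;
    while (buffer.size() - offset >= 2) {
        uint16_t msgLen = (buffer[offset] << 8) | buffer[offset + 1];
        if (buffer.size() - offset - 2 < msgLen)
            break; // partial message: leave it buffered
        complete.emplace_back(buffer.begin() + offset + 2,
                              buffer.begin() + offset + 2 + msgLen);
        offset += 2 + msgLen;
    }
    buffer.erase(buffer.begin(), buffer.begin() + offset);
    return complete;
}
```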

Hopefully you're using a good networking library that already sorts out your data transmission for you, but it is typical for systems to establish multiple communication channels that are continuously processed: reliable streams, reliable packets, unreliable ordered packets, and unreliable unordered packets. That makes it easier to categorize how much to process, and when.
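
For illustration, the channel categories might be tagged like this (hypothetical names; libraries such as ENet expose similar per-channel delivery modes):

```cpp
// Delivery guarantees per channel; each implies a different answer to
// "how much do I process now, and what do I buffer?"
enum class Delivery {
    ReliableStream,      // ordered byte stream: process complete messages, buffer the rest
    ReliablePacket,      // framed, no loss: process every packet that has arrived
    UnreliableOrdered,   // e.g. state snapshots: drop anything older than the newest seen
    UnreliableUnordered  // fire-and-forget: process immediately on arrival
};
```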

This is a server-side input-processing issue that causes uneven movement rates and "jumpy" interpolation on the client side: the server processes a different number of inputs between its updates, so player movement is uneven.

[quote]The server doesn't always simulate the same number of inputs between broadcasts of the players' positions, which in turn happens because the client does not always send the same number of inputs in each packet.[/quote]

It seems likely that you have not fixed your simulation step size/time then.

Typically, you will want a fixed step size; if you render slower than your simulation rate, you simulate more than one step between frames. This will generate more than one “the input state at tick X was Y” message per frame. (Those usually RLE-compress very well, though.) The server then dequeues input-state messages one per simulation tick as well. Ideally, you timestamp the inputs (“this is input from player A at tick X”) so that the server can detect drift over time.
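
A minimal sketch of the fixed-step variant (hypothetical names): the client runs zero or more fixed steps per rendered frame and tags each input with its tick, and the server consumes exactly one input per tick, watching the tick gap for drift.

```cpp
#include <cstdint>
#include <queue>

constexpr double kTickSeconds = 1.0 / 60.0;

struct TickedInput { uint32_t tick; float moveX, moveY; };

// Client: run zero or more fixed simulation steps per rendered frame.
void clientFrame(double frameDt, double& accumulator, uint32_t& tick,
                 std::queue<TickedInput>& outgoing) {
    accumulator += frameDt;
    while (accumulator >= kTickSeconds) {
        accumulator -= kTickSeconds;
        TickedInput in{tick++, 0.f, 0.f}; // 0,0 stands in for sampled controls
        outgoing.push(in);                // sent to the server, possibly batched
    }
}

// Server: exactly one input per simulation tick; the tick gap shows drift.
void serverTick(std::queue<TickedInput>& incoming, uint32_t expectedTick) {
    if (incoming.empty()) return;         // starved: grow the buffer slightly
    TickedInput in = incoming.front();
    incoming.pop();
    int64_t drift = (int64_t)in.tick - (int64_t)expectedTick; // >0: client ahead
    (void)drift; // applyInput(in); adjust buffering if |drift| trends over time
}
```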

If you don't use a fixed step size, you will need to measure how much time passes between rendered frames and include that information together with the event. The server then lays out a timeline so that it knows what to do when an input event “ends in the middle" of a server simulation tick. Again, you will need a queue, and you may need to lay out multiple events over the simulation timeline.
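
And a minimal sketch of the variable-step variant, again with hypothetical names: each event carries the duration it covered on the client, and the server consumes exactly one tick's worth of time per step, splitting an event that straddles a tick boundary.

```cpp
#include <deque>

struct TimedInput { double duration; float moveX, moveY; };

// Consume exactly `tickDt` seconds' worth of queued input per server tick,
// splitting an event that "ends in the middle" of the tick.
void serverTick(std::deque<TimedInput>& queue, float speed, double tickDt,
                float& x, float& y) {
    double remaining = tickDt;
    while (remaining > 0.0 && !queue.empty()) {
        TimedInput& in = queue.front();
        const double slice = (in.duration < remaining) ? in.duration : remaining;
        x += in.moveX * speed * (float)slice;
        y += in.moveY * speed * (float)slice;
        in.duration -= slice;
        remaining -= slice;
        if (in.duration <= 0.0) queue.pop_front(); // event fully consumed
    }
}
```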

enum Bool { True, False, FileNotFound };