
tl;dr

The question is rather simple: how unsynchronized can internal clocks be between different machines?


Context

We have a "hybrid" approach where the Clients are authoritative on certain things, and the Server on others. This makes our lives easier, because the gameplay doesn't need to be perfectly tight about certain things.

In this case, the Server will pre-determine a sequence of Enemy Types to be spawned at different locations, with pre-determined delays between the spawns. We would then let the Clients handle the rest during gameplay.

However, if we want the Clients to stay mostly synced on the display of the enemies, they would all need to initiate this sequence at the very same time, and we cannot rely on the network to have delivered the data to every Client at the same moment.

The idea we had was to send a timestamp in the format returned by System.currentTimeMillis(), indicating when to begin the sequence.
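To make the idea concrete, here is a minimal sketch of what we had in mind. The class and method names, and the two-second lead time, are illustrative assumptions, not part of any existing code.

```java
// Hypothetical sketch: the Server picks a start time slightly in the
// future and sends it to every Client along with the spawn sequence.
public class SpawnScheduler {
    // Lead time chosen so the message reaches all Clients before the
    // start moment passes; the value is an assumption.
    static final long LEAD_TIME_MS = 2000;

    // Server side: choose the wall-clock moment the sequence should begin.
    static long chooseStartTime() {
        return System.currentTimeMillis() + LEAD_TIME_MS;
    }

    // Client side: wait until the agreed moment, then begin the sequence.
    // Only as accurate as this machine's own clock.
    static void runAt(long startAtMs, Runnable beginSequence) throws InterruptedException {
        long wait = startAtMs - System.currentTimeMillis();
        if (wait > 0) Thread.sleep(wait);
        beginSequence.run();
    }
}
```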

But one question stands in the way: how unsynchronized can internal clocks be between different machines?

payne

1 Answer


Checking the Wikipedia article on Network Time Protocol, we find this Windows documentation as one example:

Under the right operating conditions, systems running Windows 10 or Windows Server 2016 and newer releases can deliver 1 second, 50ms (milliseconds), or 1ms accuracy.

To achieve the tighter 50 ms bound, the documentation says:

The target computer must have better than 5ms of network latency between its time source.

I'd say it's fair to assume this condition will not be met on many consumer devices that have not been specifically configured for time accuracy. For most consumer uses, a system clock accurate to within 1 second is sufficient, so I'd expect other consumer operating systems are not dramatically more accurate than this by default.

Software that requires tighter synchronization often implements its own clock adjustment.
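One common form such clock adjustment takes is the classic NTP-style round-trip estimate: the client timestamps a ping to the server, the server timestamps receipt and reply, and the client assumes network delay is symmetric. The sketch below illustrates that formula; the class and method names are my own, not from any particular library.

```java
// Hypothetical sketch of an NTP-style clock-offset estimate.
public class ClockOffset {
    /**
     * t0 = client send time, t1 = server receive time,
     * t2 = server reply time, t3 = client receive time
     * (server times are on the server's clock, client times on the client's).
     * Returns the estimated offset of the server's clock relative to the
     * client's, assuming symmetric network delay.
     */
    static long estimateOffsetMs(long t0, long t1, long t2, long t3) {
        return ((t1 - t0) + (t2 - t3)) / 2;
    }
}
```

A client that applies this offset to its local clock can agree with the server far more tightly than raw system clocks allow, at the cost of extra round trips and some jitter filtering.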

So, you should expect that system times will vary between players by multiple frames' worth, and not count on millisecond timestamps to give you accurate synchronization between machines.

Instead of trying to synchronize to a specific moment in wall-clock time, you may find it beneficial to synchronize to a specific update tick. You can think of your game as though it were turn-based, one turn per server update - it's just that the turns auto-advance on their own very quickly.

This coarser measure of time is more robust against machine variation and network latency. Every client can count up monotonically from its first update tick at game start (or at game join, if it joined in progress - in which case the server needs to stamp the initial game-state info dump with the turn number it represents). Now all clients can agree to spawn the enemy on "turn 153", using the regular rhythm of update messages / fixed-timestep beats to ensure they don't get a turn ahead or behind.
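The tick-counting scheme above can be sketched as follows. This is an illustrative skeleton, not code from any engine; the names and structure are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: each client advances a tick counter on its fixed
// update timestep and fires spawns on tick numbers agreed with the server.
public class TickClock {
    long currentTick = 0;
    final Map<Long, Runnable> scheduled = new HashMap<>();

    // Server tells every client: "spawn this enemy on turn 153".
    void scheduleOnTick(long tick, Runnable action) {
        scheduled.put(tick, action);
    }

    // Called once per fixed update; tick order matters, wall-clock time does not.
    void advance() {
        currentTick++;
        Runnable action = scheduled.remove(currentTick);
        if (action != null) action.run();
    }
}
```

Because every client counts the same ticks in the same order, they all reach "turn 153" after the same number of fixed updates, regardless of how their system clocks disagree.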

DMGregory