
I have a time authority and I want to securely set a client's time to this authority's time/date within a precision of $\delta$ seconds. The authority's public key is known to the client. This was my idea:

  1. A client sends a request with a 128-bit randomly generated nonce to the authority, and starts a timer.
  2. The server replies with $time\_data$ and $sign(time\_data || nonce)$, where $time\_data$ is some high-precision, constant-length representation of the current time (for example 16 bytes).
  3. The client waits until a response is received or until $2\delta$ seconds have passed.
  4. The client stops the timer, having measured $\Delta t$ since it started, and verifies that ${\Delta t \over 2} < \delta$. Then it verifies the signature over the received time data and the latest sent nonce. If everything passes, it sets the time to $time\_data + {\Delta t \over 2}$; if not, it goes back to step 1 (a rough sketch of this loop follows).
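
To make the loop concrete, here is a minimal client-side sketch in Python. The transport (`send_request`, `recv_response`), the `verify` primitive, and the 64.64 fixed-point decoding of $time\_data$ are all stand-ins made up for illustration, not a real API:

```python
import os
import time

DELTA = 0.1  # the required precision bound delta, in seconds (illustrative value)

def sync_time(authority_pubkey, send_request, recv_response, verify):
    """Steps 1-4 above; retries until a round trip is fast enough."""
    while True:
        nonce = os.urandom(16)           # step 1: fresh 128-bit random nonce
        start = time.monotonic()         # start the round-trip timer
        send_request(nonce)

        reply = recv_response(timeout=2 * DELTA)  # step 3: bounded wait
        dt = time.monotonic() - start    # step 4: measured round trip
        if reply is None or dt / 2 >= DELTA:
            continue                     # timed out or too slow: retry with a fresh nonce

        time_data, signature = reply
        # Verify against the *latest* nonce only, so replayed replies fail.
        if verify(authority_pubkey, time_data + nonce, signature):
            return decode_time(time_data) + dt / 2  # midpoint estimate

def decode_time(time_data: bytes) -> float:
    # Assumed encoding: 16 bytes interpreted as 64.64 fixed-point seconds.
    return int.from_bytes(time_data, "big") / 2**64
```

Note that $\Delta t$ is measured with a monotonic timer, since the local wall clock is exactly what isn't trusted yet.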

As far as I can see there is no attack on this scheme that allows an adversary to make the client accept a time that is not within $\delta$ seconds of the authority's actual time. Am I missing something?

I'm also wondering if there's any way to improve precision beyond the limit set by the network's minimum round-trip delay $\Delta t$, without losing the security of the synchronized time, similar to what NTP does with estimating network latency.
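
For comparison, NTP's estimate uses four timestamps instead of one, which lets it subtract the server's processing time from the round trip. A minimal sketch of that arithmetic (the standard NTP offset/delay formulas, assuming the authority were to sign both its receive and transmit timestamps):

```python
def ntp_estimate(t0: float, t1: float, t2: float, t3: float):
    """Standard NTP on-wire arithmetic, in seconds.

    t0 = client transmit, t1 = server receive,
    t2 = server transmit, t3 = client receive.
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2  # estimated client clock offset
    delay = (t3 - t0) - (t2 - t1)         # round trip minus server hold time
    return offset, delay
```

Even then, the offset error is bounded only by half the (possibly attacker-induced) path asymmetry, i.e. up to $delay/2$, so a check like the $2\delta$ timeout above is still what caps the worst case.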

  • Why not simply use NTP over TLS? – Stephen Touset Aug 07 '13 at 21:21
  • @StephenTouset I don't know (and haven't found accessible modern literature on) how secure NTP is against active attackers. And it kind of blows my technology stack out of the water in terms of complexity, both server-side and client-side. And doesn't NTP use UDP? – orlp Aug 07 '13 at 21:29
  • @StephenTouset Also I just realized TLS will likely be too expensive considering the budget and the expected workload of the server infrastructure. – orlp Aug 07 '13 at 21:53
  • @Thomas Why? 1) The nonce serves as a challenge; otherwise an attacker could simply return an old signed time. The nonce is an essential part of this scheme. 2) A signature scheme may or may not be randomized. I don't see how that matters here. – CodesInChaos Aug 07 '13 at 23:14
  • Is key distribution/trust outside the scope of the question? Because an attacker who can forge a signature can violate the $\delta$ bound trivially. 2) Does the client only accept messages that validate with the latest nonce it sent? I assume the purpose of the nonce is to prevent replay attacks, so either it only accepts replies for the latest nonce or there are missing rules governing which nonces the client keeps active. – B-Con Aug 07 '13 at 23:16
  • @B-Con: 1) Read: "The authority's public key is known to the client." Also, what's the point of a signature if an adversary is assumed to be able to forge one? 2) Yes, the client should only accept the latest sent nonce; let me edit the question to clear that up a bit more. – orlp Aug 07 '13 at 23:19
  • @B-Con Without the nonce, a MITM adversary could have sent a request to the server and received a signed response $100\delta$ seconds ago, then replay that as the response to the client's request, obviously violating the accuracy constraints. – orlp Aug 07 '13 at 23:29
  • @nightcracker: True, that too. And I had missed your statement (of course forgeable signatures are worthless; I was just stating the obvious given what I thought was no info on key distribution). – B-Con Aug 07 '13 at 23:35
  • The server should also send how long it took to perform the signature operation, so that it can be taken into account when calculating the differential. – Richie Frame Sep 03 '13 at 17:52
  • @RichieFrame But that data would need to be signed as well! How would you solve that? (it's signatures all the way down) – orlp Sep 03 '13 at 18:02
  • A digital signature operation should take a fairly specific time depending on system resources. If the signing operation runs at high thread priority, the server can benchmark it in advance and send the expected time as part of the signed message. – Richie Frame Sep 04 '13 at 00:22