When One-Way Latency Doesn't Matter

  • There has been a long debate in Special Relativity over whether the one-way speed of light is even measurable in principle. Some claim ingenious experimental setups can gauge it, while others criticise those setups for making some assumption about synchronising clocks that presumes the result.

    Then: if one accepts that the one-way speed of light is indeed not measurable and that only the two-way speed of light need be homogeneous, it follows that any particular one-way speed is a result of the convention used to synchronize clocks, and that only the two-way speed is physically meaningful.

    http://en.wikipedia.org/wiki/One-way_speed_of_light
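
    As a toy numeric check of that convention point (my own sketch in Python; eps below is meant as the Reichenbach-style synchrony parameter discussed in the linked article, and the numbers are arbitrary): whichever split of one-way speeds you pick, the two-way round trip comes out the same.

        c = 299_792_458.0   # the two-way speed of light, m/s
        L = 1_000.0         # one-way path length, m

        def round_trip_time(eps):
            c_out  = c / (2 * eps)          # assumed outbound one-way speed
            c_back = c / (2 * (1 - eps))    # assumed return one-way speed
            return L / c_out + L / c_back   # always 2*L/c, whatever eps is chosen

        # Same round-trip time for every choice of convention:
        print(round_trip_time(0.5), round_trip_time(0.3), round_trip_time(0.8))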

  • I wish the article would at least point out the cardinal rule of measuring a time duration: always use the same clock to measure the "start" and the "end" (a tiny sketch is at the end of this comment). There's a lot of verbiage, but the key point is lost.

    As an example of a discussion where someone made much ado about what turns out to be a clock issue:

    Original article: http://news.ycombinator.com/item?id=5146508

    My comment (which reflected the essence of the article): http://news.ycombinator.com/item?id=5146805
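
    A minimal sketch of that rule (my own toy Python, not from the article): take both readings from one monotonic clock on one host, rather than subtracting timestamps that came from two different clocks.

        import time

        def time_it(fn, *args):
            # Both readings come from the SAME monotonic clock on the SAME host.
            start = time.monotonic()
            result = fn(*args)
            return result, time.monotonic() - start

        # Anti-pattern: remote_recv_timestamp - local_send_timestamp mixes two
        # clocks with unknown skew, so the "duration" can even come out negative.
        print(time_it(sum, range(1_000_000)))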

  • In networked realtime multiplayer games you will never see the exact state the game world is currently in, but you will see a very close (depending on your latency) approximation of it. In reality, between the network packets you receive, you use a lot of tricks like client-side prediction and interpolation/extrapolation to make a very good guess at the current state of the game world.

    If your latency and that of the other players is reasonable, you don't feel the very minor corrections taking place all the time. Getting this right from an engineering point of view is easier said than done though :)
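
    A rough sketch of that extrapolate-then-correct idea (my own toy Python with 1-D positions and made-up names; real engines are considerably more involved):

        def extrapolate(last_pos, last_vel, last_packet_time, now):
            # Dead reckoning: guess where a remote player is from their last snapshot.
            dt = now - last_packet_time      # how stale the last update is
            return last_pos + last_vel * dt  # assume they kept moving the same way

        def correct(current_guess, authoritative_pos, alpha=0.2):
            # When the next packet arrives, don't snap: blend toward the authoritative
            # position over a few frames so the fix is barely noticeable.
            return current_guess + alpha * (authoritative_pos - current_guess)

        guess = extrapolate(last_pos=10.0, last_vel=2.0, last_packet_time=0.00, now=0.05)
        guess = correct(guess, authoritative_pos=10.2)   # gentle nudge, not a teleport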

  • I ran into this in the domain of networking.

    Trying to deduce one-way path latency from the round-trip time ("ping") is impossible without further assumptions. You basically cannot tell whether you are on a wired connection or whether your uplink is via a satellite by looking at just the ping output (a toy illustration is at the end of this comment). You can try to bounce pings off multiple servers, relay pings, and gather RTTs over any graph path that includes your node and the test servers, and yet you will always come up one equation short of solving the linear system you'd end up with.

    -- However --

    If you make an assumption about some property of some path, then - boom - you can find the one-way latency for any edge on the graph. For example, if you have two test servers that sit on a backbone, you can assume the path latency between them is about the same in either direction. This gives you an extra linear equation for the system, and that in turn lets you calculate the one-way latency between you and either of these servers.

    It's a pretty neat problem. Makes you think, laterally.
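
    To make the wire-versus-satellite point concrete (toy numbers of my own, nothing measured): two very different one-way splits can produce exactly the same ping.

        # Toy figures in milliseconds; only the asymmetry matters, not the exact values.
        wired_link = {"up_ms": 40, "down_ms": 40}   # roughly symmetric path
        sat_uplink = {"up_ms": 75, "down_ms": 5}    # slow satellite up, fast terrestrial down

        def rtt_ms(link):
            return link["up_ms"] + link["down_ms"]

        # Identical ping output, so RTT alone cannot separate the two cases.
        assert rtt_ms(wired_link) == rtt_ms(sat_uplink) == 80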

  • The puzzle overstates the importance of figuring out the protocol in practical applications. Why does it matter whether you are in the 2:2 case or the 3:1 case in something like games (their example)?

    The problem of networked games isn't strictly timing. The actual problem is the illusion of real time play. We want networked players to feel like everything is happening in real time. But their changes to the shared world might, due to latency, be mutually incompatible.

    The puzzle is not making a system that is accurate—that's trivial (it involves a lot of waiting). The puzzle is making a system that appears to be in real time.

  • Slightly off-topic, but what was used to make those animated diagrams?

  • OK, it's 4am, so presumably part of my brain is asleep, but I'm missing something here. If the latencies were unconstrained, I would agree that it is impossible to solve the initial riddle. However, since it states that there are only two possibilities, 2s:2s or 3s:1s, it sure seems solvable to me. You have two equations and three variables, but you don't need to solve for the three variables. You can eliminate the skew and end up with a single equation and two variables - the two latencies. That allows you to solve for the difference between the latencies. And since the possibilities are constrained, that is sufficient to solve the problem.

    Of course, in a real-world scenario the possible latencies would not be discrete, so indeed the problem would not be solvable. But that's not how it was presented. So either the puzzle was poorly constructed as an analogy for the real-world scenario, or (more likely) my sleepy brain is missing something fundamental. Somebody help me out.

  • The article misses that unequal lag gives the less-lagged players an edge.