Did I just write that headline? I plead temporary insanity, especially as I previously argued that nobody cares about the duration of calls. However, that article generated more interest than I expected, so if some of you really do care about the accuracy of metering the duration of a voice call, here is a thought experiment that shows why it is physically impossible to measure the duration of some calls to within 100 milliseconds of the actual duration that the voice channel was usable by both parties. This figure is important because a variance greater than 100ms is inconsistent with some widespread beliefs about the demonstrable accuracy of call metering.
Many factors affect when the start and end of a call are recorded, and hence the calculation of its duration. Most of these relate to how technology works, making it difficult to generalize, but there is one unavoidable factor that has nothing to do with technological choices and will inevitably result in some calls being less accurate than others: the distance between the two parties on the call. Whatever network infrastructure is used, and whatever systems are responsible for generating a CDR, nothing can travel faster than the speed of light. The time it takes to propagate signals will have a non-trivial impact if you are measuring duration in milliseconds. This is because either party may end a call. If the B-party ends the call, then there is no useful channel between the parties from the instant that the B-party hangs up, but there will be a delay before the A-party receives the signal that the call has ended. This propagation delay means that the system that meters the duration of the call will also be affected by the lag before the signal is received, irrespective of the exact specification of that system, and no matter where it is located in the world. Showing that this can introduce deviations greater than 100ms hence comes down to examining some basic properties of speed and distance.
The Speed of Signals
It would be a gross simplification to say the speed of signals is solely determined by the speed at which photons move, but signals certainly do not travel faster than the speed of light, because nothing travels faster than that. The speed of light in a vacuum is 299,792km/s, but it is more pertinent to observe that the speed of light in a typical fibre optic cable is approximately 200,000km/s. So if we had two users with a single straight 200,000km cable between them, a whole second would have to elapse before a signal from one end reached the other.
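The arithmetic is simple enough to sketch in a few lines of Python, using the ~200,000km/s figure for fibre given above:

```python
# One-way propagation delay over a fibre route, using the approximate
# speed of light in optical fibre quoted in the text.
FIBRE_SPEED_KM_S = 200_000

def fibre_delay_ms(route_km: float) -> float:
    """One-way signal delay in milliseconds over a fibre of the given length."""
    return route_km / FIBRE_SPEED_KM_S * 1000

# A hypothetical straight 200,000 km cable: a full second end to end.
print(fibre_delay_ms(200_000))  # 1000.0 ms
```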
The Distance between Phone Users
We typically phone people who live nearby, but not always. If we traced a line around the circumference of the planet from the city of Edinburgh, in Scotland, to the city of Invercargill, in New Zealand, its length would be 18,792km. So even if an absolutely straight fibre optic cable were stretched across the surface of the planet between those two points, it would take 18,792/200,000 = 0.094s, or 94ms, for a signal to traverse its length. But you may have noticed we do not have absolutely straight cables that connect every city to every other city. To take a more realistic example, consider that the FLAG Europe-Asia (FEA) submarine cable, which connects Miura in Japan to Porthcurno in England, has a total length of around 28,000km, after allowing for some slack in the laying of the cable on the ocean floor. So if a call used this cable between users in those two specific locations, it would take 28,000/200,000 = 0.14s, or 140ms, for a signal to pass from end to end.
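Plugging both route lengths into the same delay calculation reproduces the figures above:

```python
FIBRE_SPEED_KM_S = 200_000  # approximate speed of light in optical fibre

def fibre_delay_ms(route_km: float) -> float:
    """One-way signal delay in milliseconds over a fibre of the given length."""
    return route_km / FIBRE_SPEED_KM_S * 1000

# Great-circle distance from Edinburgh to Invercargill, as quoted in the text
print(round(fibre_delay_ms(18_792)))  # 94 ms
# FLAG Europe-Asia cable, including slack laid on the ocean floor
print(round(fibre_delay_ms(28_000)))  # 140 ms
```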
But What about Satellites?
Relatively few international calls rely on satellites, but you might think that using a satellite would be quicker because the signals would move at the speed of light in a vacuum, and they would travel in absolutely straight lines. However, satellites can be even slower because of their altitude. A geosynchronous communications satellite is 36,000km above the planet’s surface, so the signal would need to travel 36,000km*2 = 72,000km just to go straight up and straight down again, without making any allowance for the distance between the two parties to our call. Even at the speed of light in a vacuum, this would be a delay of approximately 0.24s, or 240ms.
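The same back-of-envelope calculation works for the geosynchronous case, this time using the vacuum speed of light:

```python
LIGHT_SPEED_KM_S = 299_792  # speed of light in a vacuum
GEO_ALTITUDE_KM = 36_000    # approximate altitude of a geosynchronous satellite

# Up to the satellite and straight back down again, ignoring any
# horizontal distance between the two parties on the call.
round_trip_km = 2 * GEO_ALTITUDE_KM
delay_ms = round_trip_km / LIGHT_SPEED_KM_S * 1000
print(round(delay_ms))  # 240 ms
```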
Communications satellites can be in orbits that bring them much closer to Earth. For example, the first communications satellite, Telstar, would periodically come within 1,000km of the Earth’s surface, and Iridium satellites orbit at approximately 780km. However, a lower altitude also means a shorter range for communication. So whilst just three geosynchronous satellites could cover virtually the whole planet, the Iridium system uses 66 satellites to obtain global coverage. Once you account for the distance a signal must travel between each satellite and its neighbors, a call that is relayed across multiple satellites to connect parties on opposite sides of the planet will still endure a significant lag in the propagation of signals.
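To get a feel for the low-orbit case, here is a rough sketch of the end-to-end path for a relayed call. The hop count and inter-satellite spacing below are illustrative assumptions for the sake of the estimate, not Iridium's actual routing:

```python
LIGHT_SPEED_KM_S = 299_792   # inter-satellite links travel through a vacuum
IRIDIUM_ALTITUDE_KM = 780    # approximate Iridium orbital altitude

# Assumed figures for a rough estimate only: a chain of inter-satellite
# hops carrying the signal to the far side of the planet.
HOP_KM = 4_000  # assumed spacing between neighboring satellites
HOPS = 6        # assumed number of inter-satellite links

path_km = 2 * IRIDIUM_ALTITUDE_KM + HOPS * HOP_KM  # up, across, and down
delay_ms = path_km / LIGHT_SPEED_KM_S * 1000
print(round(delay_ms))  # ~85 ms under these assumptions
```

Even with generous assumptions, the one-way lag remains in the tens of milliseconds, so low orbits reduce but do not eliminate the propagation problem.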
I hope this convinces you that there are real-life scenarios where the measured duration of a call is never going to be accurate to within a tenth of a second. So whenever somebody starts talking about proving very high levels of accuracy for metering, be sure to ask them about the locations of the phone users in the scenario they are discussing.