A Public Test of Network Congestion? Or a More Sinister Use of Data?

Normally we think of data as something businesses gather as a result of engaging in lots of transactions with customers. But if lots of customers gang together, they can also collate lots of data about businesses. The internet then allows them to share the results each time they ‘test’ a business, meaning the public gains new insights into how businesses are performing. The most familiar examples are subjective review sites, allowing customers to rate everything from the latest cinema release to the view from their hotel window. A lobbying group with the pompous name of Battle for the Net (one of their subtler publicity images is shown above) have gone one step further, encouraging ISP customers to perform objective tests of network interconnections. Their aim is to collate data that will reveal where congestion is occurring in a network, saying this will help them enforce net neutrality. But will it work? And is BFTN a front for another organization with more selfish objectives?

Visitors to BFTN’s website are offered an ‘internet health test’, instigated by a simple click of a button. When I conducted the test, all I saw was a rather underwhelming internet speed test, with the conclusion that the download and upload speeds were just what I expected them to be. I was also told that my ‘valuable open data’ was helping to keep the net neutral, the skies blue and the oceans wet, without being told exactly what data had been collected. What really annoyed me was being presented with a link to the relevant data policy only after pressing a button that immediately started the data collection. That makes me think less of feel-good warriors for justice campaigning against evil corporations, and more of huge data-grabbing corporations like Google… a topic I will return to below.

BFTN claim the test will ‘detect net neutrality violations’. My test revealed nothing, but even the best-case example presented by BFTN falls a long way short of proving that any regulation has been broken. The most generous analysis is that the test will indicate if there is congestion. BFTN repeatedly conflate congestion with throttling, as if deliberate throttling is the only conceivable cause of congestion. They gave the game away when making the following observation.

One test once won’t say anything conclusive (the problem could be local to your home connection, or a brief issue, etc.). But many tests taken together produce a lot of data, which over time begins to expose systemic problems and Net Neutrality violations.

Well, it would take a lot of data to conclusively decide that congestion was due to systemic problems and not lots of separate ‘brief issues’. I cannot imagine myself, or most people, repeatedly hammering away at this button in the hope of generating enough data to distinguish between the two.
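To give a sense of how much data ‘many tests taken together’ really implies, here is a minimal sketch of the kind of aggregation that would be needed. It is my own illustration, not BFTN’s or M-Lab’s method, and the threshold and minimum sample count are invented for the example:

```python
# Toy aggregation of repeated speed-test samples. Illustration only:
# the threshold and minimum sample count are my assumptions, not BFTN's.
from collections import defaultdict
from statistics import median

def flag_recurring_slowdowns(samples, min_samples=30, threshold=0.8):
    """samples: (hour_of_day, mbps) tuples pooled from many tests.

    Returns the hours whose median throughput sits persistently below
    `threshold` times the overall median -- a crude way to separate a
    recurring pattern from isolated 'brief issues'.
    """
    overall = median(mbps for _, mbps in samples)
    by_hour = defaultdict(list)
    for hour, mbps in samples:
        by_hour[hour].append(mbps)
    return [hour for hour, vals in sorted(by_hour.items())
            if len(vals) >= min_samples and median(vals) < threshold * overall]
```

Even this crude filter wants dozens of samples for every hour of the day, for every pairing of access ISP and transit ISP, before it says anything at all. That is far more button-clicking than any casual visitor will supply.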

Here lies the murky thinking behind the oversimplified arguments for net neutrality. Poor performance is conflated with conspiracy, and conspiracy is taken as proof that rules are being broken. But being unhappy with repeated poor service from an ISP is not the same as having real evidence that ISPs have intentionally broken rules. Real businesses make mistakes, suffer from miscalculations, and generate inaccurate forecasts. Good luck to BFTN if they think they have a method that can conclusively distinguish between deliberate throttling and a situation where occasional heavy use by customers exacerbates problems caused by poor planning. If they could do that, the best approach would be to sell their network capacity planning solution to the ISPs, instead of running to the regulator and demanding action be taken. Regulators are just as lousy at predicting network capacity requirements as everyone else, so threatening companies with punishment if they do not perform well enough is as hollow as promising that a speed test can distinguish between congestion caused by the provider and congestion caused by lots of customers using the same service at the same time.

What really strikes me as odd about this data gathering program is that anybody with sufficient resources could do a reasonable job of independently testing for congestion, without needing to appeal to random individuals to click buttons wherever they are. If the sample is driven by the response of individuals, it will be skewed. The people who click will be the ones who found out about the test, were most aware of the politics of net neutrality, and are probably unhappy with their ISP. No data will be gathered from customers who are satisfied or indifferent. A better way to independently assess network congestion would ape what telcos do in other domains, by setting up lots of test nodes and having them repeatedly contact each other. The software could be designed to run on high-end computers dedicated to the task, or it could be distributed to run routinely on lots of PCs when they are otherwise idle. Such test programs could easily be executed by companies like Google, who have vast amounts of money and access to computers everywhere. Google also has political reasons to perform these tests – it is clearly in their interest to promote net neutrality. So why is the test being offered by a bunch of non-profit social justice warriors called Battle for the Net, when Google could do a lot more testing for less than a thousandth of a percent of its annual profits?
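For what it is worth, the sort of dedicated test node I have in mind requires no exotic technology. The sketch below is a hypothetical illustration, not how M-Lab’s servers actually work: one cooperating host serves a fixed payload, and a peer times the transfer to estimate throughput across the path between them.

```python
# Hypothetical test node, for illustration only. One host serves a
# fixed payload; a peer times the download to estimate path throughput.
import socket
import time

PAYLOAD = b"x" * (10 * 1024 * 1024)  # 10 MB test payload

def serve(port=9000):
    """Run on one test node: send the payload to every client that connects."""
    with socket.create_server(("", port)) as srv:
        while True:
            conn, _addr = srv.accept()
            with conn:
                conn.sendall(PAYLOAD)

def measure(host, port=9000):
    """Run on a peer node: time the download and return Mbit/s."""
    start = time.monotonic()
    received = 0
    with socket.create_connection((host, port)) as sock:
        while chunk := sock.recv(65536):
            received += len(chunk)
    elapsed = time.monotonic() - start
    return (received * 8) / (elapsed * 1_000_000)
```

Scheduled to run between many pairs of nodes at regular intervals, a mesh like this samples every hour of the day evenly, instead of sampling whenever an aggrieved customer happens to click a button.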

I think one potentially upsetting answer is that the warriors have become the pawns of Google. If you dig into who collects the data from BFTN’s internet health test, then you find it is M-Lab. And when you dig into who M-Lab is, you find it is a collection of think tanks, academics, and Google. In other words, it is a lot of people with little money, and Google. Here is what M-Lab says about itself:

M-Lab was founded by the New America Foundation’s Open Technology Institute (OTI), the PlanetLab Consortium, Google Inc. and academic researchers.

That sounds like quite a few other people, and Google. And then you dig further. Google is the only organization with more than one seat on the M-Lab steering committee. And the New America Foundation is chaired by Eric Schmidt, Executive Chairman of Google. And Google is one of only three charter members in the PlanetLab Consortium, contributing a huge chunk of PlanetLab’s funding and holding a permanent seat on its steering committee. So, in other words, M-Lab was really founded by: Google plus others; Google plus others; Google; and others.

Forgive me if I start to become slightly suspicious of the extent of Google’s influence over M-Lab. Americans hear plenty about the supposed influence that the American cable industry has over some American politicians. It seemed odd to me how often Google cropped up as I did the detailed research for this article. They are bankrolling the net neutrality campaign advocated by Battle for the Net, even though you will not find Google’s name mentioned once on BFTN’s website.

How useful is M-Lab’s testing? According to BFTN, the testing will stop naughty businesses from breaking net neutrality rules. But when M-Lab wrote a report saying that ‘harm’ had been caused to internet consumers, this was the focus of their criticism:

Observed performance degradation was nearly always diurnal, such that performance for access ISP customers was significantly worse during peak use hours, defined by the Federal Communications Commission (FCC) as the hours between 7pm and 11pm local time.

This allows us to conclude that congestion and under-provisioning were causal factors in the observed degradation symptoms.

Well, duh. Who knew that congestion might occur when most people are making most use of the internet? Do we really need objective measurement to point out the obvious? Next they will be telling us that traffic jams occur during rush hour.

Notice that M-Lab’s conclusion is entirely circular. According to them, if congestion occurs, then not having enough bandwidth is the reason why we suffer the symptoms of congestion. That is akin to a tautology. But what does this have to do with neutrality? If a network is congested because demand is greater than supply, then neutrality has no bearing on that result. Neutrality – the decision not to prioritize any form of traffic over any other – is not a solution to problems caused by demand exceeding supply. For all the fuss made about enforcing laws, a law to impose neutrality is not an order to give greater supply for the same price. But the solution to congestion, without any infringement of neutrality, can only be delivered by increasing the supply.
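The point can be made with trivial arithmetic. Suppose a link carries 100 Mbps and customers collectively demand 120 Mbps at peak. However the traffic is prioritized, 20 Mbps of demand goes unserved; prioritization only chooses whose traffic bears the shortfall. The toy calculation below (my own illustration, with invented numbers) makes that explicit:

```python
# Toy illustration with invented numbers: when demand exceeds capacity,
# every allocation policy leaves the same total shortfall -- it merely
# changes who bears it.
CAPACITY = 100.0  # Mbps available on the congested link

def allocate(demands, priority_order):
    """Serve traffic classes in priority order until capacity runs out."""
    served, remaining = {}, CAPACITY
    for name in priority_order:
        served[name] = min(demands[name], remaining)
        remaining -= served[name]
    return served

demands = {"video": 70.0, "web": 30.0, "other": 20.0}  # 120 Mbps demanded

neutral_ish = allocate(demands, ["video", "web", "other"])
fast_lane = allocate(demands, ["other", "video", "web"])

# Both policies serve exactly 100 Mbps; the 20 Mbps shortfall never vanishes.
assert sum(neutral_ish.values()) == sum(fast_lane.values()) == CAPACITY
```

Only extra capacity removes the shortfall; no neutrality rule, and no violation of one, changes the arithmetic.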

At this point, I want to find fault with M-Lab’s definition of network degradation, which follows.

We define “degradation” as a drop in download throughput, an increase in round trip time, or an increase in packet retransmission rate, measured against a respective baseline of historical performance on a given Access ISP/Transit ISP pair.

On one hand, this is a perfectly straightforward working definition, suited to how ordinary people rate the services they receive in real life. ‘Degradation’ means ‘worse than before’. On the other hand, this definition would be useless for enforcing a rule. If it were the basis for enforcing a rule about internet services, ISPs would have an incentive to make only small improvements in their service. They may be forced to make their service better – but will have good reason to do no more. Woe betide any ISP that invests heavily in infrastructure and greatly improves performance over a short period of time. They will receive no extra thanks, and probably no more revenue, but they will definitely be at greater risk because of the punishment they will receive if they allow performance to ‘degrade’ from the new standard they have set for themselves. Per the over-simplistic logic advocated by M-Lab, it would be better for an ISP to improve its service by only 1 percent each year than to improve it by 10 percent in one year and then suffer a ‘degradation’ of 1 percent the next.
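To see the perverse incentive in numbers (my own back-of-envelope figures, not M-Lab’s), compare two hypothetical ISPs against a baseline-relative flag in the spirit of M-Lab’s definition:

```python
# Illustration only: a 'degradation' flag in the spirit of M-Lab's
# definition (worse than historical performance), applied to two
# hypothetical ISPs. All figures are invented for the example.
def degraded(history):
    """Flag degradation if the latest throughput falls below the best
    previously observed throughput for this ISP pair."""
    *baseline, latest = history
    return latest < max(baseline)

cautious = [100.0, 101.0, 102.0]   # +1% a year, every year
ambitious = [100.0, 110.0, 108.9]  # +10% in one year, then a 1% slip

print(degraded(cautious))   # False -- slower service, but 'healthy'
print(degraded(ambitious))  # True  -- faster service, yet 'degraded'
```

The flagged ISP still delivers more throughput than the unflagged one; the rule punishes the bigger investor for a minor slip from its own high-water mark.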

There is another important flaw in the argument that these tests can be used to enforce net neutrality rules. Even M-Lab’s report admits that if they found a congestion problem, they would not know who to blame:

…it is important to note that we cannot determine which actors or actions are “responsible” for observed degradation

Whoop-de-doo for law enforcement that tells you the law was broken but cannot say who should be punished.

We cannot tell whether any particular ISP between the user and a measurement point is “at fault,” what the contractual agreement between ISPs did or did not dictate vis-a-vis interconnection, or whether specific network modification was done to alleviate or magnify a given incident. Similarly, we cannot identify the precise cause of performance problems (e.g. a broken router) in a path between a client on a given Access ISP and a M-Lab measurement point, although we take steps to narrow the range of possible causes. Our data shows that traffic from specific Access ISP customers across interconnections with specific Transit ISPs experienced degraded performance, and that this degradation forms a pattern wherever specific Access ISPs and Transit ISPs exchange traffic. Speculating beyond that is not within the scope of this report.

Hmmm… so for all their measurement, M-Lab really cannot tell you much, other than whether an internet service was faster or slower than usual. The cause might be accidental, or it might not. It might have something to do with a contract, or not. The ISPs might have tried to reduce congestion, or they might have tried to make it worse. In short: they have no clue what is going on.

Notice what has happened here. M-Lab, a strait-laced front for Google, has no idea if neutrality has been violated. It can tell if congestion occurs, but not why, or who is to blame. This Google front then offers its ‘internet health test’ to advocacy groups, who promise the test will do the following.

The Internet Health Test checks your connection for signs of any degradation at all. With enough people taking part around the country every day, we can make sure that ISPs don’t get away with breaking the rules.

We have leaped from a situation where the people who collect the data say it cannot be used to tell if a neutrality rule was violated, or even who is to blame for the congestion they observe, to another group of people promising the same data will be used to make sure that ISPs do not break neutrality rules. Their mistake is remarkably useful to Google, which has found a way to encourage a bunch of independent people to make a lot of public claims that Google knows it cannot make itself, because they are not literally true.

Why does Google need organizations like BFTN to encourage people to perform its network congestion tests? It is estimated that 7.3 million Chromebooks will be sold this year. Chromebooks run Google’s operating system, but they do not come with hard drives, meaning they primarily run Google’s services over the internet. That is a lot of computers that will transfer a lot of data over the internet every day. Would it really be so difficult to co-opt some useful data on network congestion from all these networked devices?

Going further, why does Google involve the public in this testing at all? An organization with the enormous resources of Google could just do the testing, without needing to engage the public. Perhaps the reason Google does not use Chromebooks for testing is that it wants to respect customer privacy. But why bother asking any ordinary people to engage in these internet health tests when Google can comfortably afford to drive their own internet-connected wi-fi snooping vehicles down every road in the country?

Why bother engaging with the public, getting them to click a button and share their data… oh. I think I worked out what is going on here. The data might highlight where networks suffer congestion. But it will definitely highlight who supports Google’s political interests and where those people are.

If this data were only going to a bunch of left-leaning political do-gooders, then I would be relaxed about how much harm could be done by gathering it. Google, on the other hand, are very good at analyzing data. And whilst Google’s motto is ‘don’t be evil’, Google knows plenty about gathering and using private data for all sorts of dubious purposes.

In conclusion, I am amazed at the insular and blinkered debate about net neutrality in the USA. Some people say they can objectively test for violations of net neutrality when they obviously cannot. Google hovers in the background, and nobody asks why this huge internet business takes such an active interest in minimizing the costs of the internet to big business users. Data is collected for the ostensible purpose of analyzing networks, but the way the data is collected means it is more useful for analyzing people. And then the whole thing is presented as if the neutrality of the internet can be decided solely by US law and regulation, even though the internet is an international network.

It is true that a lot of internet infrastructure is currently located in the US; this website is hosted on an American server. However, I think this imbalance in distribution is a temporary skew that will even out over time, in the same way that Europeans used to have many more mobile phones than Americans, but Americans have somewhat caught up. American-based entertainment and information businesses will want to serve content from the US, to their US customers, and hence they will be keen for a regulatory framework which gives them the best possible infrastructure at minimal cost. However, it is idiotic to advertise to the world that you have an ‘internet health test’ when the only purpose of the test is to influence American rules and regulations. And whilst low-cost infrastructure might suit American businesses in the short term, neither they nor their customers will benefit if ill-judged regulations depress infrastructure investment, because network companies would be burdened with increasing costs whilst having no opportunity to increase their revenues.

I often use the analogy of the railways when discussing how the internet will change over a period of decades. In different countries around the world, the first railway tracks were typically laid by lots of private companies. Those companies progressively merged to cut costs, creating oligopolies and monopolies that increasingly came under state control. The same will happen to the internet, and the net neutrality debate is partly an anticipation of a wider public policy debate about state control and ownership. The analogy also highlights how the mighty can fall. The US was once crossed by railways built during a frenzy of private investment. These lines of communication were crucial to realizing enormous growth in the American economy. Today, American railways are a shadow of their former glory, and many nations boast superior rail networks. We can debate whether this is due to a failure of the private sector or a failure of the state to intervene as it should, but we can be certain of one thing. A public good like a network will not serve the majority of customers well if it is run for the benefit of one customer above all others.

When it comes to the health of the internet, a health warning should be applied whenever Google seeks to influence the debate. They are an extremely wealthy and powerful corporation that occupies a monopolistic position in its market. Even if Google is right on some policy matters, people should be wary of siding with them. They should be even more wary when Google is hiding behind other organizations, who are free to make wild and inaccurate claims that Google cannot make. If an ISP wins a court case to shut down the overblown rhetoric of a little-known political lobbying organization, the ISP will still lose in the court of public opinion. But if an ISP won the same court case against Google, they would deal a severe blow to their political opponents.

In this instance, I believe the average person clicking on this ‘internet health test’ would not appreciate Google’s role in the testing program, or how their data might be used by Google. Google may advocate neutrality, but they are not neutral. If they want independent measures of network congestion, they should spend the money on doing it for themselves, without misleading the public about what M-Lab can sensibly hope to achieve with a scatter-gun approach to testing. Everything about their testing approach is suspicious, and my suspicions grew worse when I tried to investigate M-Lab’s finances. I am far from convinced that the ‘internet health test’ will deliver useful results, but I am sure that Google’s involvement is selfish at best, and possibly a means to gather politically useful data about private individuals, under the cover of a false pretext.

Eric Priezkalns
Eric is the Editor of Commsrisk. Look here for more about the history of Commsrisk and the role played by Eric.

Eric is also the Chief Executive of the Risk & Assurance Group (RAG), a global association of professionals working in risk management and business assurance for communications providers.

Previously Eric was Director of Risk Management for Qatar Telecom and he has worked with Cable & Wireless, T‑Mobile, Sky, Worldcom and other telcos. He was lead author of Revenue Assurance: Expert Opinions for Communications Providers, published by CRC Press. He is a qualified chartered accountant, with degrees in information systems, and in mathematics and philosophy.