Too Much Privacy is also a Risk

In his post entitled ‘An ISP That Wants Privacy to be a USP’, Eric Priezkalns reviewed the potential for Nicholas Merrill and the Calyx Institute to launch an ISP which guarantees unassailable privacy for its customers. Rob Chapman responds by questioning where to draw the line when it comes to the privacy of communications.

Ah yes – we all like our privacy, but what of our safety? It’s a difficult balancing act. Many years ago I was the victim of identity theft, and I resent the number of CCTV cameras which effectively track my movements from day to day. However, when I think about the threats which have been, and will continue to be, eliminated, I can’t complain too heavily.

We’re all, at least, fairly aware of the example of Google, with its vans, aerial and satellite shots, and the public disdain it has been met with. However, does that example speak more about intent, use, disclosure or public understanding? Personally, I think much of it can be put down to a mix of ignorance and propaganda. Even if we haven’t known, and haven’t necessarily understood, what Google has used and will use the information for, we have known for some time that it is being gathered.

Though my understanding of the complete solution and infrastructure to be put in place is limited, I’m inclined to say that Mr Merrill’s plans are short-sighted, or meant simply to make a point. Aside from the assurance challenges this would pose – a flat pricing model and the need for massive network capacity for resiliency – there remains the issue of the greater good.

Consider the old axiom which, for the most part, stands firm: if you do nothing wrong, you have nothing to hide. This remains an ideal when talking about governments and, for the vast majority of the democratic world, it remains our right to expect our governments to act responsibly and with sensitivity towards our civil liberties. I know that is a rosy view of how things should be, but we have to hope for the best. We should also bear in mind that much of the surveillance, be it audio-visual or data related, is not actively used. Just because I’ve been caught on camera from leaving my house to arriving at work does not mean that I’ve been actively watched or tracked.

Then we come to the potential commercial benefits which can be realised by ISPs who don’t treat data with enough sensitivity. There are significant challenges here, not just in juggling the legal and regulatory handling of data, but also in balancing those requirements against a natural urge to exploit data which can give a company a lead in an ultra-competitive marketplace.

Let’s take the industry-standard TR-069 protocol, which allows certain remote functionality for in-home devices. Many UK ISPs have been using this functionality for years for a variety of things, from remote configuration of routers to improved diagnostic capabilities. The protocol allows for router parameter changes and the polling of data directly from routers. The data polled is not just router-specific information; it can also include state and parameter changes (e.g. passwords) and details of connected end-user devices (including MAC addresses). It also affords a means to change any of the user-configurable parameters remotely, from the security settings to the SSID and WiFi password.
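To make this concrete, here is a rough, illustrative sketch – not taken from any particular ISP’s implementation – of the kind of CWMP GetParameterValues request an ISP’s auto-configuration server might send to a home router. The parameter paths follow the TR-098 InternetGatewayDevice data model commonly used with TR-069; the exact paths, authentication and session handling vary by device and vendor, and a real deployment involves a full CWMP session rather than a single message.

```python
# Illustrative sketch only: parameter paths follow the TR-098 (InternetGatewayDevice)
# data model used with TR-069. A real ACS would exchange this inside an
# authenticated CWMP/SOAP session initiated by the router.

READ_PARAMS = [
    "InternetGatewayDevice.DeviceInfo.SoftwareVersion",
    "InternetGatewayDevice.LANDevice.1.WLANConfiguration.1.SSID",
    "InternetGatewayDevice.LANDevice.1.WLANConfiguration.1.KeyPassphrase",  # WiFi password
    "InternetGatewayDevice.LANDevice.1.Hosts.Host.",  # partial path: connected devices, incl. MAC addresses
]

def get_parameter_values_envelope(names):
    """Build a minimal CWMP GetParameterValues SOAP body for the given parameter paths."""
    items = "\n".join(f"        <string>{n}</string>" for n in names)
    return f"""<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:cwmp="urn:dslforum-org:cwmp-1-0">
  <soapenv:Body>
    <cwmp:GetParameterValues>
      <ParameterNames soap-enc:arrayType="xsd:string[{len(names)}]"
          xmlns:soap-enc="http://schemas.xmlsoap.org/soap/encoding/"
          xmlns:xsd="http://www.w3.org/2001/XMLSchema">
{items}
      </ParameterNames>
    </cwmp:GetParameterValues>
  </soapenv:Body>
</soapenv:Envelope>"""

if __name__ == "__main__":
    print(get_parameter_values_envelope(READ_PARAMS))
```

The point is simply that the SSID, the WiFi passphrase and the list of connected devices are all ordinary, addressable parameters in the same data model as routine diagnostics; nothing technical separates them.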

We know the technology is out there in the market today, but the larger question remains around disclosure and commercial sensitivity to information which could be considered intrusive.

In terms of disclosure, as long as ISPs are not returning ‘intrusive’ data without first advising their customers, or at least offering an opt-out mechanism, there should be a greatly reduced risk of customers feeling any kind of infringement. This does raise the question of what is reasonable to collect in the first place. Passwords are an obvious candidate for exclusion at all times, regardless of customer buy-in to any such scheme. Other information, such as the SSID and connected-device details, poses less of a risk if customers have not objected – but the treatment and handling of such data should still be careful, with its use and visibility restricted. Other data poses no intrusion or threat at all and can easily be justified as aiding companies with in-home issues and service/diagnostic improvements.
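As an illustration of how such a collection policy might be expressed, the sketch below encodes the three tiers just described. It is a hypothetical example rather than any ISP’s actual implementation; the parameter tokens and the simple substring matching are assumptions made for brevity.

```python
# Hypothetical policy filter reflecting the tiers above: credentials are never
# collected, 'sensitive but tolerable' fields (SSID, connected devices) respect
# the customer's opt-out, and plain diagnostics always pass.

ALWAYS_EXCLUDE = ("KeyPassphrase", "Password", "PreSharedKey")
OPT_OUT_SENSITIVE = ("SSID", "Hosts.Host")

def allowed_to_collect(parameter_path: str, customer_opted_out: bool) -> bool:
    """Return True if this TR-069 parameter may be polled for this customer."""
    if any(token in parameter_path for token in ALWAYS_EXCLUDE):
        return False                   # never collect credentials
    if any(token in parameter_path for token in OPT_OUT_SENSITIVE):
        return not customer_opted_out  # honour the opt-out for intrusive data
    return True                        # sync rates, error counters, uptime, etc.

# Example: an opted-out customer's SSID is skipped, but line diagnostics are kept.
print(allowed_to_collect(
    "InternetGatewayDevice.LANDevice.1.WLANConfiguration.1.SSID", True))              # False
print(allowed_to_collect(
    "InternetGatewayDevice.WANDevice.1.WANDSLInterfaceConfig.UpstreamNoiseMargin", True))  # True
```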

Then there is the question of the treatment, handling and use of the data. In the UK, ISPs have an obligation to ensure that they follow DR&R regulations, which make communications-related data that an ISP has processed available to the government upon lawful request. We also have data protection regulation, which means the company must ensure proper commercial handling of data. Then we have the question of companies wanting to utilise the data captured to enhance sales through targeted marketing and to reduce churn. There is nothing particularly new or challenging in the scope of TR-069 that differs from how ISPs already have to handle similar data (such as RADIUS logs). The real question for me is: how do companies ensure that they don’t exploit information which they shouldn’t? This comes down to a variety of areas: disclosure of the intended uses of both remote configuration and data, ensuring privacy policies and terms & conditions are updated accordingly, and keeping data intended purely for diagnostic support held and managed away from general business access and reporting*.

Of course, most of the general public have little idea as to exactly what information is captured as standard, or what could be. With only basic information about who the service’s users are, I perceive the possibility of a fully encrypted service to be a serious risk and, whilst I admire and support the actions of Mr Merrill in his dealings with the FBI, I think the pendulum has swung too far in the opposite direction to be healthy. All that said, most of this is based on assumption, and Eric makes a fine point that criminals already have access to public key cryptography, so where’s the harm in offering it on a wider scale? I’ll be following things closely to see how they develop – and, of course, to get the chance to see the data.

*There is, of course, nothing within the Data Protection Act which precludes companies from retaining and using data for longer than twelve months, as long as there is no longer any association with the individual.

Rob Chapman
Rob is the Chief Operating Officer of the Risk & Assurance Group (RAG). He is responsible for the planning and execution of each RAG event. Rob's goal is to bring together professionals from across the industry and drive RAG's agenda forward.

Rob started working for RAG full time in 2018, having served as Chair on a voluntary basis for the previous four years.

Before joining RAG, Rob was a senior consultant at Cartesian. He has worked in revenue assurance and billing roles for TalkTalk, Verizon Business, Energis and Hutchison 3G.
