Google Claims that Blocking Cookies Is Bad for Privacy

They keep passing laws in Europe, much to Google's chagrin. Some of these laws are meant to defend the user's right to privacy whilst accessing the internet. This includes giving users the right to block cookies. But according to Google, the European Union completely misunderstands the role of cookies, the small pieces of data that Google uses to collect as much information about you as possible. Last week Google launched their own 'privacy initiative' by saying:

…large scale blocking of cookies undermine (sic) people’s privacy…

Yes, they actually wrote that. Google not only pays six-figure salaries to goons who do not understand basic grammar, but also rewards them for defying logic. According to Google, not having trackers on your computer that monitor everything you do increases the risk of your privacy being infringed. If you believe this is true, consider whether you agree with any of the following statements.

  • installing CCTV cameras in the town center makes it harder to watch people
  • automatic number plate recognition makes it more difficult to monitor the movement of cars
  • sharing personal data on a social network makes it less likely you will be impersonated
  • Google values your privacy more than its profits

Justin Schuh, Google’s Director of Chrome Engineering, should be ashamed of the Alice-in-Wonderland reasoning he presented in a post with the disingenuous title of “Building a more private web”. Here is the bizarre argument which concludes cookies are vital for privacy protection:

…large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting. With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.

In other words, if you do not make it easy for everyone to track your movements through cookies, then some people may use more complicated techniques to accomplish the same goal. Presumably this logic is also adopted by people who leave their front door unlocked because they do not want burglars to smash a window.
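To see why fingerprinting is harder to escape than cookies, consider how little it takes to build one. The following is a minimal, illustrative sketch (the `fingerprint` function and the attribute names are invented for this example, not taken from any real tracker): it simply hashes a handful of browser properties into a stable identifier. Because these properties are the same on every site you visit, the identifier follows you everywhere, and unlike a cookie there is nothing stored on your machine to delete.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine browser attributes into a stable pseudo-identifier.

    Sorting the keys makes the result deterministic: the same
    browser configuration always hashes to the same value, on
    every website that computes it.
    """
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical attributes a script could read from a browser.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "Europe/London",
    "fonts": "Arial,Helvetica,Times New Roman",
}

print(fingerprint(visitor))
```

The point of the sketch is that no data needs to be written to the user's device at all; clearing cookies changes nothing, because the inputs to the hash are the device's own characteristics.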

Fingerprinting a web user is a privacy invasion. So is forcing people to accept cookies. Both are wrong, and both can be combated through controls over technology, whether those controls are adopted voluntarily by businesses or imposed by lawmakers.

After offering his fig-leaf argument about fingerprinting, Schuh quickly shifted focus to the real reason for his new privacy initiative, which is defending the revenues of Google and of ad-funded publishers in general.

…blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web. Many publishers have been able to continue to invest in freely accessible content because they can be confident that their advertising will fund their costs. If this funding is cut, we are concerned that we will see much less accessible content for everyone.

Who does Schuh think he is fooling? Aram Zucker-Scharff, who works on advertising research and development for The Washington Post, is certainly not fooled.

Advertising is one way to make money from delivering content, but it is not the only way to make money, as ably demonstrated by the rise in streaming television services where consumers pay a subscription instead of suffering ad breaks during their favorite programs. The evidence that publishers can continue to make money without relying on cookies is presented in a persuasive article by privacy academics Jonathan Mayer and Arvind Narayanan.

The more troubling aspect of Google’s argument is the belief that all users should be forced to sacrifice their privacy so some people can get content they like for free. Perhaps Google is unaware, but every country already has a government which can make everyone pay for things that only some people want. The last thing we need is Google taking everyone’s data so they can decide who will benefit as a result.

Starting with today’s announcements, we will work with the web community to develop new standards that advance privacy, while continuing to support free access to content.

In other words, Google wants new ‘privacy standards’ that are strictly limited in scope, so they do not threaten Google’s ability to collect data about users. To achieve their goal, Google intends to ‘work with’ rivals who have already done the opposite of what Google wants, because they already created browsers that block cookies. We shall have to see if ‘working with’ the community means bullying, bribing, or using an army of lobbyists and political henchmen to exert control over everybody who threatens Google’s advertising business.

We are following the web standards process and seeking industry feedback on our initial ideas for the Privacy Sandbox.

Put simply, it is possible to write code to block cookies and some people (not Google) have written that kind of code. So Google wants to outlaw that code by creating a ‘sandbox’ so businesses that want to track you with cookies can continue to track you with cookies.
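For readers who doubt that blocking cookies is technically simple, here is a minimal sketch of the idea (the `strip_tracking_headers` function is invented for illustration, not the code of any real browser): a filtering proxy or browser need only discard `Set-Cookie` headers before a response reaches the page, and no cookie is ever stored. This is exactly the kind of straightforward code that Google's 'sandbox' approach is designed to route around.

```python
def strip_tracking_headers(headers: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Return the HTTP response headers with every Set-Cookie header removed.

    Header names are compared case-insensitively, as HTTP requires,
    so 'Set-Cookie' and 'set-cookie' are both dropped.
    """
    return [
        (name, value)
        for name, value in headers
        if name.lower() != "set-cookie"
    ]

# Example: a response that tries to plant two cookies.
response_headers = [
    ("Content-Type", "text/html"),
    ("Set-Cookie", "id=abc123"),
    ("set-cookie", "session=xyz"),
]

print(strip_tracking_headers(response_headers))
```

Firefox and Safari implement far more sophisticated versions of this filtering, but the underlying principle is no more complicated than the sketch above.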

Google’s initiative might work because they have always been adept at persuading the ill-informed to think what Google wants them to think. Some people will actually sacrifice their privacy because Google is saying they are protecting it. After all, Google played the same trick with net neutrality. Many people still think net neutrality has something to do with freedom of speech instead of just being a method to lower Google’s costs by forcing all internet network infrastructure costs to be carried by ISPs and their customers.

So please give Google some feedback on their proposals. You could begin by switching from the Chrome browser to Firefox and its enhanced tracking protection, or to Safari and its intelligent tracking prevention.

Eric Priezkalns
Eric is the Editor of Commsrisk. Look here for more about the history of Commsrisk and the role played by Eric.

Eric is also the Chief Executive of the Risk & Assurance Group (RAG), a global association of professionals working in risk management and business assurance for communications providers.

Previously Eric was Director of Risk Management for Qatar Telecom and he has worked with Cable & Wireless, T‑Mobile, Sky, Worldcom and other telcos. He was lead author of Revenue Assurance: Expert Opinions for Communications Providers, published by CRC Press. He is a qualified chartered accountant, with degrees in information systems, and in mathematics and philosophy.