I favor end-to-end encryption (E2EE) as the only foolproof way to ensure the privacy of everyone’s communications, but there is no stronger counterargument than the need to protect children by reviewing the messages they receive online. The practical issues are best understood in the context of Facebook, which recently rebranded itself as Meta. Much fuss was made last year by users of WhatsApp, which Meta owns, who felt their privacy was not being respected and so urged friends and colleagues to switch en masse to alternatives like Signal and Telegram in the belief that those platforms had superior encryption. Many of them believed falsehoods reported by unreliable journalists, but negative perceptions are still bad for business. On the other hand, Facebook is widely used by children, and hence offers a potential conduit for abusers to find and exploit victims. Facebook uses technology to monitor communications with children in order to identify possible instances of sexual abuse so they can be reported to child protection agencies. So when Facebook announced plans to encrypt communications end-to-end, not least because of the furore over WhatsApp, governments and members of the public expressed concern that E2EE would protect sex predators. This is how the UK government’s 2021 child sex abuse strategy characterized the issue.
In particular, Facebook’s proposals to apply end-to-end encryption to their messaging platforms by default presents significant challenges. While the Government supports strong encryption for protecting personal data, privacy and services such as banking, commerce and communications, we are concerned that end-to-end encryption has created significant and avoidable barriers to companies being able to identify and prevent illegal activity by child abusers. We will continue to stress that Facebook should only implement its proposals for end-to-end encryption if the safety of its users will not be reduced, including ensuring a means for law enforcement to obtain lawful access to the content of communications.
There are no half measures when it comes to encryption, even if the public believe technology should deliver impossible compromises. Either encryption is secure enough that nobody can break it, or else messages can be spied upon for reasons that may be bad as well as good. This can put governments in a difficult position. They may sincerely want to protect people by intercepting and analyzing electronic communications, but there will never be a perfect system that supports routine interception whilst guaranteeing the power to intercept is never misused. That means governments can expect criticism from two directions at the same time. However, I find it odd that the UK government has adopted a novel approach to influencing how a privately owned business behaves without taking the responsibility that comes with passing a law to mandate that behavior. The ‘No Place to Hide’ campaign, which was launched in January, has only one explicit goal: to discourage internet businesses from implementing E2EE. Though not mentioned by name, it is obvious that Facebook is its focus. The following was taken from the new No Place to Hide website.
Right now, some social media companies can detect child sexual abuse material being shared on their platforms and report it to law enforcement. This plays an important part in stopping child sex abusers, and these companies deserve to be praised for this.
But some are planning to introduce end-to-end-encryption, which scrambles messages so that only the sender and receiver can see what is being shared.
This means they will no longer be able to detect child sexual abuse on their platforms and therefore won’t be able to report it.
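The ‘scrambling’ the campaign describes can be illustrated with a deliberately simplified sketch. Real E2EE systems, such as the Signal protocol used by WhatsApp, rely on authenticated public-key cryptography and key ratcheting; the toy below is not real cryptography, and only shows the structural point at issue: a platform that relays ciphertext without holding the key learns nothing readable, and so cannot scan it.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy 'scrambling': XOR each byte with a key byte (NOT real cryptography)."""
    return bytes(b ^ k for b, k in zip(data, key))

# Sender and receiver share a secret key; the platform never sees it.
message = b"meet at noon"
key = os.urandom(len(message))          # one-time pad, as long as the message

ciphertext = xor_bytes(message, key)    # this is all the platform can observe
assert ciphertext != message            # the relayed bytes are unreadable

recovered = xor_bytes(ciphertext, key)  # only a key holder can reverse it
assert recovered == message
```

This is exactly the property that makes E2EE attractive for privacy and, at the same time, makes server-side content detection impossible: the two are the same property seen from different sides.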
The campaign looks as if it is led by a series of charities with a relevant interest in protecting children, and they are said to be represented on the campaign’s steering group. However, the campaign receives all its financial support from the UK government. A Freedom of Information request revealed that the UK government has budgeted GBP534,000 (USD724,000) to pay advertising agency M&C Saatchi to create the website and other advertising materials for the campaign.
This is a brazen attempt to use taxpayers’ money to influence public opinion in order to put pressure on Meta and other social media companies. It also seems that nobody is accountable for whether this British advertising campaign misleads the British public. The website goes on to say:
If these plans go ahead an estimated 14 million reports of suspected child sexual abuse online could be lost each year. This could have a catastrophic impact on child safety.
As you might expect from such an effective high-profile advertising agency, M&C Saatchi have successfully persuaded many British journalists to repeat the statistic about 14 million annual reports of child sexual abuse being lost. What few British journalists seem to have noticed is that the number of children in Britain is fewer than 13 million. Whilst it is possible that one child could be the subject of multiple reports within the same year, a statistic that implies more than one report of sexual abuse per child per year should have been handled with caution. Further examination shows the figure does not relate to the number of reports currently made to UK child protection agencies, but is actually driven by all the reports made to a US nonprofit that primarily seeks to protect the 73 million children in that country.
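The scale mismatch is easy to check with back-of-the-envelope arithmetic. Taking the campaign’s headline figure at face value alongside a rough UK under-18 population of 12.7 million (the exact population number here is an assumption for illustration), the claim would imply more than one report per British child per year:

```python
# Back-of-the-envelope check on the campaign's headline figure.
reports_claimed = 14_000_000   # "14 million reports ... could be lost each year"
uk_children = 12_700_000       # approximate UK under-18 population (assumption)

reports_per_child = reports_claimed / uk_children
assert reports_per_child > 1   # more than one report per child per year
print(f"{reports_per_child:.2f} reports per UK child per year")  # ≈ 1.10
```

A ratio above one is not impossible, since one child can be the subject of multiple reports, but it is the kind of result that should prompt a journalist to ask where the number came from.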
It is not clear why M&C Saatchi and a series of British charities thought they needed to repeat a statistic that is not specific to the UK in order to influence opinion in the UK, but it is clear that they expect to make it central to their emotive messaging. The graphic at the top of this article was copied from the No Place to Hide website. As you can see, it does give the source of the statistic as NCMEC, the abbreviated name of the National Center for Missing & Exploited Children, a US nonprofit. The link given by No Place to Hide takes users to a page on the NCMEC website but not to any specific mention of the statistic itself. I was unable to find the statistic reproduced anywhere on that website. However, other information on NCMEC’s website shows that the figure being quoted in the UK is similar to numbers they have stated about the number of reports they receive from US internet businesses.
As the nation’s clearinghouse for missing and exploited children issues, the National Center for Missing & Exploited Children (NCMEC) bears witness every day to how the internet is used to perpetuate online demand for graphic sexual abuse images of children…
Every day at NCMEC we analyze tens of thousands of reports of children who are raped and sexually abused while photos and videos are made… in 2018 alone we received over 18 million reports…
If end-to-end encryption is implemented without a solution in place to safeguard children, NCMEC estimates that more than half of its CyberTipline reports will vanish.
NCMEC are clear about how much they depend on internet businesses for the information they receive.
Internet companies were the most frequent reporter of online enticement to the CyberTipline (71%), followed distantly by parents/guardians (14%) and members of the general public unknown to the child victim (4%).
Though I could not find data about the importance of Facebook on NCMEC’s own website, there was more detailed NCMEC-sourced data in a 2020 UNICEF report that discusses child protection and E2EE.
In 2018, Facebook made 16.8 million reports to the US National Center for Missing Exploited Children (NCMEC) – more than 90 per cent of the 18.4 million total reports that year…
…The UK National Crime Agency estimates that, last year, NCMEC reporting from Facebook will have resulted in more than 2,500 arrests by UK law enforcement and almost 3,000 children safeguarded in the UK.
Our understanding is that much of this activity, which is critical to protecting children and fighting terrorism, will no longer be possible if Facebook implements its proposals as planned. NCMEC estimates that 70 per cent of Facebook’s reporting – 12 million reports globally – would be lost.
The mention of “12 million reports globally” indicates that the statistic repeated by No Place to Hide is also a global statistic, and has most likely risen above the 2018 figure to reflect the growth in internet usage since then. So whilst the No Place to Hide campaign does not explicitly target Facebook, it is Facebook’s policies that the UK government is primarily seeking to influence, and whilst decisions about Facebook’s technology have ramifications for users in many countries, the UK government is bankrolling a campaign driven by statistics from a US business. Hiding behind a campaign front spares the UK government from difficult questions, such as why it is unable to offer any statistics about the number of British children who might be hurt as a result of a change in policy by Facebook. Meta reported in 2020 that the number of reports generated by Facebook that are relevant to the UK is actually in the range of 75,000.
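The figures quoted above are internally consistent, which supports reading the campaign’s statistic as a global NCMEC number rather than a UK one. A quick check, using only the numbers already cited:

```python
facebook_reports_2018 = 16_800_000   # Facebook reports to NCMEC in 2018
total_reports_2018 = 18_400_000      # all NCMEC reports that year
uk_relevant_reports = 75_000         # Meta's figure for UK-relevant reports

# Facebook's share of all NCMEC reports: "more than 90 per cent"
share = facebook_reports_2018 / total_reports_2018
assert share > 0.90                  # ≈ 0.913

# NCMEC's estimated loss: "70 per cent of Facebook's reporting – 12 million"
estimated_loss = 0.70 * facebook_reports_2018
assert round(estimated_loss / 1_000_000) == 12   # ≈ 11.8 million, i.e. "12 million"

# The UK-relevant volume is a tiny fraction of the global figure
print(f"UK share of Facebook reports: {uk_relevant_reports / facebook_reports_2018:.2%}")
```

In other words, well under one percent of the reporting behind the headline statistic is even relevant to the UK audience the campaign addresses.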
NCMEC has an interest in influencing Facebook’s policy because of the extent to which it depends on Facebook’s data. UNICEF’s report on child protection strikes a better balance between the need to protect children from predators and the need to protect their privacy, and thus highlights the extent to which NCMEC is not a wholly unbiased advocate.
There is real concern that if digital communications platforms, including messaging apps, default to end-to-end encryption, almost all of the reports provided to NCMEC will cease. This is because it will not be technically possible even for law enforcement to access communications that are end-to-end encrypted, which means that they cannot use software to scan for illegal content. This will limit the evidence available to aid law enforcement investigations…
However, there are limitations to the extent to which reports made to NCMEC lead to actual cases of crimes against children being solved. When NCMEC receives cases not involving the US it is referred on to the relevant national law enforcement agency depending on the nationality and location of the child and offender. The response from national law enforcement agencies currently varies widely as a consequence of capacity and resource constraints. Even though it is hoped that this will change in the future with many countries upscaling their national response systems, including with UNICEF support, it remains a reality that capacity and resources to combat these crimes are extremely limited in many contexts. This means that a sustained stream of NCMEC reports will not necessarily lead to a safer environment for children until national law enforcement agencies are allocated sufficient resources to arrest and prosecute child sexual abuse offenders, including those operating in the digital environment…
…it is currently unclear how many investigations or arrests directly derive from NCMEC reports at the global level, or how many fewer would have been made with end-to-end encryption implemented. Attempts at collecting this data is currently underway by INTERPOL, but the lack of information makes it difficult to assess the potential drawbacks of implementing end-to-end encryption on Facebook Messenger specifically. The loss of reports to NCMEC has been one of the key arguments against implementing end-to-end encryption, but until more data is available, it is not possible to determine what implications this will actually have on law enforcement operations.
Research performed by Meta also suggests the total number of reports they forward to NCMEC does not give a good measure of the number of crimes that have occurred or the number of children put at risk.
To understand how and why people share child exploitative content on Facebook and Instagram, we conducted an in-depth analysis of the illegal child exploitative content we reported to the National Center for Missing and Exploited Children (NCMEC) in October and November of 2020. We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period. While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many.
…We worked with leading experts on child exploitation, including NCMEC, to develop a research-backed taxonomy to categorize a person’s apparent intent in sharing this content. Based on this taxonomy, we evaluated 150 accounts that we reported to NCMEC for uploading child exploitative content in July and August of 2020 and January 2021, and we estimate that more than 75% of these people did not exhibit malicious intent (i.e. did not intend to harm a child). Instead, they appeared to share for other reasons, such as outrage or in poor humor (i.e. a child’s genitals being bitten by an animal).
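Meta’s finding that most reported content consists of copies of previously reported material reflects how platforms detect it: known items are fingerprinted (in production with perceptual hashes such as PhotoDNA, so that slightly altered copies still match) and new uploads are compared against the fingerprint database. The sketch below substitutes exact SHA-256 hashing, a simplification, and uses an invented stream of uploads to show why the report count can far exceed the number of distinct items or victims:

```python
import hashlib
from collections import Counter

def fingerprint(content: bytes) -> str:
    # Production systems use perceptual hashes (e.g. PhotoDNA) so that
    # slightly altered copies still match; exact hashing is a simplification.
    return hashlib.sha256(content).hexdigest()

# Hypothetical stream of reported uploads: many are copies of a few items.
uploads = [b"video-A"] * 6 + [b"video-B"] * 3 + [b"video-C"]

counts = Counter(fingerprint(u) for u in uploads)
print(f"{len(uploads)} reports, {len(counts)} distinct items")  # 10 reports, 3 distinct items

# A handful of items can account for most reports:
top_share = counts.most_common(1)[0][1] / len(uploads)
assert top_share >= 0.5   # the single most-copied item is over half the reports
```

Each matched upload generates a report, so the reporting volume measures circulation of known material at least as much as it measures new crimes.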
The UK government’s crude intervention in the debate over E2EE also prompted an interjection from Stephen Bonner, Executive Director for Innovation and Technology at the UK Information Commissioner’s Office. Bonner offered a more nuanced perspective that aligns with views expressed by UNICEF.
The discussion on end-to-end encryption use is too unbalanced to make a wise and informed choice. There is too much focus on the costs without also weighing up the significant benefits.
E2EE serves an important role both in safeguarding our privacy and online safety. It strengthens children’s online safety by not allowing criminals and abusers to send them harmful content or access their pictures or location…
E2EE is seen by some to hinder the clamp down on child abusers because it leaves law enforcers blind to harmful content. But having access to encrypted content is not the only way to catch abusers. Law enforcers have other methods such as listening to reports of those targeted, infiltrating the groups planning these offences, using evidence from convicted abusers and their systems to identify other offenders.
We are also seeing a range of other techniques and innovations available that can be used without accessing content to help stop abuse or catch those trying to harm. As an example, platforms are listening to teenagers’ reports and limiting search results for anyone attempting unwanted contact.
A skeptic might ask if the British government has created an ‘astroturf’ campaign that resembles a grassroots campaign just to focus public attention on social media companies and distract from government failings. Such a campaign certainly helps the government avoid parliamentary scrutiny of the adequacy of the resources it currently provides to enforce the law. If Meta produces 75,000 reports of abuse relevant to the UK each year, but only 2,500 arrests are made, there is room to question whether every report has been adequately followed up. There is even more reason to question why the British public is told that 14 million child abuse reports hinge on how they feel about social media, when that statistic is just as pertinent to children and parents in other countries. By concentrating on the statistics that the government and NCMEC want to talk about, we are shepherded away from a more rounded discussion of the benefits of E2EE and of the extent to which the UK government is doing all it should to protect children.