There has been a flurry of headlines in the last two days, after the Times reported that “WhatsApp, Facebook and other social media platforms will be forced to disclose encrypted messages” to law enforcement agencies, albeit with a focus on “suspected terrorists, pedophiles and other serious criminals.” The report suggested that an imminent treaty between the U.S. and U.K. would break the encrypted messaging lock, finally giving law enforcement what they crave—access to message content.
Is this true? Because, if so, it would be one of the most unpopular assaults on widespread user privacy in recent times. Well, it’s not that simple. These reports are almost certainly misleading, mixing up data sharing agreements with government-mandated backdoors—which are entirely separate things. Yes, there are discussions between lawmakers to address messaging and other forms of encryption—but it would be highly unusual for such radical change to come first in the form of a treaty giving the U.K. access to U.S. data.
As I’ve reported before, crime fighting and intelligence agencies in the U.S. and U.K. have a major issue with “going dark,” by which they mean an inability to penetrate end-to-end encrypted messaging platforms, even with a court warrant in hand. The platforms defend the position they have taken—any backdoor designed for the good guys will inevitably be exploited by the bad guys; a vulnerability is a vulnerability. And anyway, they say, who gets to decide who’s good and who’s bad? If the backdoor exists for the U.S., then what about Russia or China or the Middle East?
Back in July, U.K. Home Secretary (interior minister) Priti Patel accused Facebook of frustrating the fight against terrorists and child abusers, with its plans to extend the end-to-end encryption in use by WhatsApp across the rest of its platform. “Where systems are deliberately designed using end-to-end encryption, which prevents any form of access to content,” she wrote in the Daily Telegraph, “no matter what crimes that may enable, we must act.”
And Ms Patel is at the heart of these latest reports. According to the Times, Priti Patel “will sign an agreement next month that compels U.S. social media companies to hand over information to the police, security services and prosecutors. The data access agreement, which marks the culmination of four years of intense lobbying by the UK, is seen by Downing Street as an essential tool in the fight against terrorism and sexual abuse.”
Encrypted messaging information can be the metadata associated with the message—the platforms have that. But to disclose the content would require a complete rethink of the ways in which messages are sent from one individual to another. And where such encryption has been broken—recent nation-state hacks have reportedly accessed this content—it has been a compromise of the device, not the platform.
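To make the metadata-versus-content distinction concrete, here is a toy sketch of what an end-to-end encrypted platform actually stores. The XOR “cipher,” the field names and the participant names are all illustrative stand-ins, not any real platform’s design—the point is simply that the server holds routing metadata plus opaque bytes, while the decryption key lives only on the two devices:

```python
import os
import time

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher: XOR with a random key.
    # Real platforms use vetted protocols (e.g. Signal's), not this.
    return bytes(b ^ k for b, k in zip(data, key))

# The shared key exists only on the two endpoint devices,
# never on the platform's servers.
key = os.urandom(32)

plaintext = b"meet at noon"
ciphertext = xor_cipher(plaintext, key)

# What the platform can see—and hand over on a legal request:
# who messaged whom, and when, plus unreadable payload bytes.
server_record = {
    "sender": "alice",
    "recipient": "bob",
    "timestamp": time.time(),
    "payload": ciphertext,  # opaque without the device-held key
}

# Only a device holding the key can recover the content.
recovered = xor_cipher(server_record["payload"], key)
assert recovered == plaintext
```

A data sharing treaty changes who may request `server_record`; it does nothing to conjure up `key`, which the platform never had.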
To jump from this to the conclusion that U.S. platforms such as Facebook/WhatsApp—along with Signal and Wickr—will break their encryption security because of a U.S./U.K. data sharing treaty misses the point. The platforms capture metadata—essentially who messages whom, when and how often—and such data can be retrieved and supplied to law enforcement based on a legal request. Most of that data is captured by U.S. organisations. A data sharing agreement would give the U.K. the right to request that data from the U.S., and vice versa, but it doesn’t in and of itself extend the scope of the data, without a significant change in U.S. law.
“We oppose government attempts to build backdoors,” Facebook said in a statement published by Bloomberg, “because they would undermine the privacy and security of our users everywhere. Government policies like the Cloud Act allow for companies to provide available information when we receive valid legal requests and do not require companies to build back doors.”
Critically, the 2018 Clarifying Lawful Overseas Use of Data (CLOUD) Act does open the door to the U.S. sharing data with overseas governments, and it does give the authorities access to data wherever it might be stored, but it has no provision to mandate decryption of that data when it is encrypted by customers. And with end-to-end encryption, the data is customer-encrypted, with customers holding the keys.
There are moves to extend the scope of the legal rights of government to mandate access to restricted information, but that’s a really big deal, and it will start with a huge political fight in the U.S. before it happens. Earlier this year it was reported that the U.S. government was exploring just such legislative options to prohibit forms of encryption that law enforcement can’t break. Since then we have seen U.S. Attorney General William Barr argue that technology companies must not stand in the way of backdoors being introduced onto their platforms.
All of which created a huge backlash in the technology and privacy sectors.
Patel made her July statements in the midst of a ‘Five Eyes’ meeting she was hosting in London, with representatives from U.S., U.K., Canadian, Australian and New Zealand intelligence agencies discussing measures to enable law enforcement to access end-to-end encrypted platforms. Their argument is that as the workload from terrorism, child endangerment and cross-border organized crime escalates, “we need to ensure that our law enforcement and security and intelligence agencies are able to gain lawful and exceptional access to the information they need.”
“Technology is moving fast, and privacy needs to move with it,” Joel Wallenstrom—the CEO of uber-secure messaging platform Wickr—told me after the U.S. discussions came to light. “These are all completely legitimate, understandable, even predictable concerns coming from law enforcement and elsewhere.”
Maybe so. But the challenge for government is that there are no good options for resolving this. Privacy is an issue, as is data compromise. And any shift away from total security can be exploited. Patel cited the Ghost Protocol idea proposed by U.K. spy agency GCHQ as one option. This thought-piece called for “an extra end” in end-to-end encrypted messaging, allowing governments (when required) to listen in. But technology companies, privacy experts and human rights groups published an open response, claiming “it would introduce potential unintentional vulnerabilities, increase risks that communications systems could be abused or misused.”
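The GCHQ thought-piece describes silently adding the government as an extra recipient of a conversation. A toy sketch of why critics call that a structural vulnerability, with XOR standing in for the real public-key “wrapping” a messenger would use, and all names hypothetical: group messaging already fans a per-message key out to every participant, so the “ghost” is just one more fan-out entry that the clients are instructed not to display.

```python
import os

def wrap_key(message_key: bytes, recipient_key: bytes) -> bytes:
    # Toy key-wrap: XOR (stand-in for real public-key encryption).
    return bytes(a ^ b for a, b in zip(message_key, recipient_key))

# Each participant's long-term key, held on their own device.
keys = {"alice": os.urandom(32), "bob": os.urandom(32)}
ghost_key = os.urandom(32)  # the silently added "extra end"

# A fresh key encrypts this one message.
message_key = os.urandom(32)

# Normal fan-out: wrap the message key for each visible recipient...
envelope = {name: wrap_key(message_key, k) for name, k in keys.items()}

# ...plus the ghost, which the clients are told not to display.
envelope["_ghost"] = wrap_key(message_key, ghost_key)

# The ghost can now unwrap the same message key as any
# legitimate recipient—the math of the encryption is untouched,
# but the client's participant list can no longer be trusted.
assert wrap_key(envelope["_ghost"], ghost_key) == message_key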
“The ghost protocol idea has been proven over and over to be unsustainable,” Wallenstrom told me. “Deciding who gets access to this kind of [intercept] technology means we’re in the business of determining who’s good and who’s bad.” He also pointed out that removing privacy protections opens up content so the platforms themselves can “go snooping through user data.”
It seems this story about data sharing has been stretched into stark implications, none of which were clear in the initial reporting. To be clear, for WhatsApp and other U.S. platforms to break their encryption would require major technical changes. They will fight hard against such moves. And there are other platforms, such as Telegram, that would fall outside the rules. If we were about to see a genuine breaking of messaging encryption, I would expect to see the technical proposals discussed—exactly as we saw with the “ghost protocol” idea. We have seen nothing of the sort yet.
Clearly there are no guarantees, but on balance encrypted message content still seems safe for the time being. If that changes, it will be the technology story of the year.