The government ordering Apple to break its encryption is stupid, counter-productive and unworkable
For the love of god, ask the computer people if your plan might actually work first.
The British constitution has many strange traditions. We open Parliament with a knock on the door by an official known as Black Rod. When a monarch is crowned, they are anointed with oils while hidden behind a curtain. And every few years, we hold a ritual debate about whether the government can force tech companies to break their end-to-end encryption.
It’s a solemn tradition, and you know how it goes: The government or security sources outline the pressing need to access the messages or cloud storage of terrorists and child abusers. Then the plan is revealed to be laughably unworkable, and finally, as per tradition, there’s an embarrassing climbdown and the status quo persists once again.
Anyway, the reason I mention this is that it’s that time again.
According to the Washington Post, the UK government has issued an “undisclosed order” to Apple, obliging the company to hand the government “blanket capability to view fully encrypted material, not merely assistance in cracking a specific account”.
As the Post notes, this is an unprecedented request in a western democracy, and something that Apple does not provide to any other democracy – even the United States.[1] In fact, the only country where Apple does make this concession is China, for the obvious reason that it has an unavoidable need to remain in the good graces of the country where it manufactures its devices and where there are literally a billion possible customers.
So it’s genuinely quite a shocking thing to see the government demand this capability – both because of the opaque way it has happened (the existence of the “Technical Capability Notice” was only revealed by a leak[2]), and because nobody in the government appears to have asked the opinion of anyone who knows about computers first.
Because anyone who knows about computers could have told them: forcing Apple to break encryption is a terrible idea in principle, utterly stupid in practice, and completely unworkable in reality.
Subscribe (for free) to get more takes on politics, policy, tech and more, direct to your inbox!
End-to-end stupidity
The backstory to this request is the passage of the 2016 Investigatory Powers Act, which massively increased the government’s surveillance powers, put mass data collection of the sort that Edward Snowden exposed on a legal footing, and weakened digital privacy rights.
When the law was initially proposed, it would have obliged tech firms to build backdoors into their systems that use end-to-end (E2E) encryption – the form of cryptography used by apps like WhatsApp and Apple’s iMessage, which makes it computationally infeasible for anyone other than the sender or receiver to see the contents of messages.
With E2E, not even Meta (aka Facebook, which owns WhatsApp) or Apple employees can read what you’ve been texting on their respective platforms. And it’s a technology now used routinely wherever data is sensitive – Zoom, Facebook Messenger, Google Meet and FaceTime video calls, among others – in addition to the major messaging apps.[3]
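To make that concrete, here’s a minimal sketch of the core idea using the PyNaCl library. The users Alice and Bob, and the eavesdropping ‘platform’, are my own illustrative assumptions – real apps like WhatsApp and iMessage layer far more elaborate protocols on top of the same primitive:

```python
# A minimal sketch of end-to-end encryption using PyNaCl
# (pip install pynacl). Illustrative only - real apps use far
# more elaborate protocols built on the same core primitive.
from nacl.public import PrivateKey, Box

# Each user generates a keypair; the private key never leaves the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the *public* keys are shared, e.g. via the platform's servers.
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice_private, bob_public).encrypt(b"meet at noon")

# The platform relays ciphertext it cannot read; only Bob (or Alice)
# holds a private key capable of decrypting it.
assert Box(bob_private, alice_public).decrypt(ciphertext) == b"meet at noon"
```

The point is that the platform only ever handles public keys and ciphertext – the ability to decrypt lives solely on the two devices at either end.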
This new order specifically relates to Apple’s “Advanced Data Protection” – which applies the same E2E encryption to iCloud backups of your iPhone, including your photos, notes, and a bunch of other things that get stored in the cloud.
So you can see in an abstract sense why the government wants access – these are places where Bad Things can happen, and without encryption it would be easier for the police and security services to stop them from happening.
However, as tech types have to argue every time proposals to break encryption are made, building in backdoors would be an insane thing to do.
The problem is that you can’t selectively break encryption for only the good guys. Any backdoor built into iCloud wouldn’t just weaken encryption for the baddies – it would weaken it for everyone and leave literally tens of millions of people and billions of messages, photos and other digital documents more vulnerable to hackers, cyberattacks and other modern digital nasties. It would undermine a fundamental building block of how our modern world actually works.
This is because, whether you like it or not, breaking encryption is basically a binary choice: a message is either end-to-end encrypted, or it isn’t. There is no magic form of words, written by the smartest lawyers and legislators, that can get around this problem. It’s just how cryptographic maths and logic work.[4]
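To see why a backdoor can’t be limited to the good guys, consider what one would actually have to look like. Here’s a deliberately crude, hypothetical key-escrow sketch – my own assumption for illustration, nothing like anything Apple has built – again using PyNaCl:

```python
# Hypothetical key-escrow "backdoor", sketched with PyNaCl.
# Not a real design, and not how Apple works - it just shows
# why access cannot be limited to authorised parties.
from nacl.public import PrivateKey, SealedBox

escrow_private = PrivateKey.generate()      # held by "the good guys"
escrow_public = escrow_private.public_key   # baked into every device

def backdoored_send(message: bytes) -> bytes:
    # Alongside the normal E2E ciphertext, the device must also
    # encrypt a copy of every message to the escrow key.
    return SealedBox(escrow_public).encrypt(message)

escrow_copy = backdoored_send(b"meet at noon")

# Whoever holds escrow_private - lawfully, or after stealing it -
# can decrypt ALL such copies. The cryptography cannot distinguish
# a warrant from a breach.
print(SealedBox(escrow_private).decrypt(escrow_copy))
```

Every device has to carry the escrow public key, and whoever obtains the matching private key – an agency, a contractor, or the hacker who steals it from them – can read everything, everywhere. The maths has no concept of a warrant.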
So E2E encryption is an incredibly good thing for keeping our data safe. And don’t just take my word for it – if you click here or here you can see end-to-end encryption being recommended by, er, the government’s National Cyber Security Centre.
However, so far I’ve only made the practical argument in favour of E2E. There’s also an argument from pure principle: it would be unwise and unprecedented to give the state unrestricted access to the enormous swathes of our lives that are now mediated through digital platforms.[5]
This is all why, when backdoors were first proposed, and after the traditional arguments back and forth had been performed, the requirement was watered down before Parliament voted the bill into law in 2016. The ‘compromise’ that got it through was that instead of the text of the law demanding tech firms create these sorts of backdoors, it instead hands the government the power to ask tech firms to break their encryption if, at some future point, it believes it necessary.
From then until now, the encryption status quo has held. Essentially, both sides have pretended that yes, conceivably the government could issue such an order – while knowing that in practice such a thing would be functionally impossible, and an insane idea, because of all the problems described above.
But I guess someone behind the scenes must have missed a memo – as now the order has actually been made, and it has kickstarted another doomed attempt to break a major platform’s E2E encryption.
Though if I had to guess, I’d say this will end the same way it has every other time, because of the trade-off involved.
I mean, on the one hand, encryption makes it harder to investigate criminals. But on the other, removing encryption undermines everyone’s personal digital security, and our national security collectively.
Frankly, it’s absurd that we (correctly) worry about absolutely catastrophic hacks by hostile powers while simultaneously proposing to weaken our most basic communications infrastructure.
Or perhaps I’m wrong, and politicians are completely comfortable with the idea that, next time there’s an embarrassing leak of their group chats, the leaker won’t be a disgruntled leadership rival but Russian military intelligence or the Chinese Communist Party?
A question of growth
If you’ve followed the encryption-breaking debate before, then much of the above will already be familiar to you. Like debating the existence of God, or whether Donald Trump is smart, every argument for and against has been articulated before, to the point where everyone is just going through the motions.
But there is something different this time, I think – and that’s the context.
If we know one thing about the current government, it is that its overriding political priority is to kickstart economic growth.
Over the last few months, we’ve seen Keir Starmer and his cabinet focus mostly on regulatory issues as blockers on growth – hence announcements on loosening the rules on everything from nuclear power to the planning system.
The hope is that by making Britain a more attractive place to do business, it can attract more international investment from big companies that will create high-skilled jobs, pay taxes and grow the economy.
That’s why the Prime Minister has spent much of his premiership trying to butter up big businesses like, say… Apple.
Now I don’t know how Apple CEO Tim Cook will react to the order to break encryption, but I’d be surprised if his company doesn’t fight it every step of the way – because breaking encryption would make Apple’s products worse, by making them less secure.
Over the last decade, Apple has made privacy and security a core part of its brand – because it knows that if people are going to trust their phones to store every intimate detail about their lives, they need to be confident that the data stored on their devices won’t be compromised, whether by criminals, hackers or governments.
That’s why, since its launch in 2007, the iPhone has become an increasingly locked-down experience. If an app wants to use your camera or access your photo library, you have to explicitly grant it permission. And over the last decade, Apple has doubled down on privacy, promising to keep your data encrypted and performing activities like AI photo analysis on the device itself, rather than sending your photos to the cloud for processing.
In my view, this is hugely laudable, but the company does not do it for principled reasons. There is a cold business logic to it: unlike its competitors, Apple does not have a business model predicated on selling advertising (like Google) or on the network effects of a “social graph” (like Facebook). So its privacy and security commitments are a significant competitive advantage.[6]
So, faced with a government demand to essentially crack open everyone’s phones? Of course it will fight this, as it has a strong commercial incentive to do so.
There is evidence of how far Apple will go, too: back in 2016, the company fought tooth and nail against an FBI demand to crack open the iPhone of one of the San Bernardino attackers – and in the end the company won, and it wasn’t forced to compromise the phone.[7]
And this gets me back to the current government’s priorities. My point is this: aside from all of the arguments above, what are tech firms like Apple going to take away from this order when evaluating Britain as a place to do business? Will they want to compromise the integrity of their software and their brands, and make themselves more vulnerable to attack? If the government wants to attract high-tech business… this seems like a weird way to do it.
Think different
Maybe this time will be different. Perhaps my assumptions about the inevitability of a climbdown are wrong. Perhaps Apple will judge that, as in China, selling iPhones to sixty million Britons and keeping our government happy is more important than consistent principles – or that it is at least worth trading off against stronger protections for its users’ data.
But somehow, I doubt it.
The text of the law and the accompanying code of practice both describe how companies are only obliged to crack open their encryption if it is “practicable” and “technically feasible” to do so.
Now, to be clear, I’m not a lawyer, nor do I know anything about legal arguments – but this seems to me like some pretty significant wiggle room. For example, is it “practicable” if following the order would compromise millions of people’s data? I don’t know.
But if Apple’s E2E encryption is compromised and the storied tradition is finally broken, then it still won’t be a ‘win’ for the government. It will be a disaster for Britain’s digital security, and a blaring siren emoji to tech firms in the struggle for growth.
So… let’s not do anything stupid? And perhaps next time, before making crazy demands, it would be sensible for policymakers to shout down the corridor and ask the IT department if what they want to do is really a good idea.
Subscribe (for free!) to get more takes on policy, politics, tech and This Sort Of Thing direct to your inbox!
1. This footnote is to acknowledge your very clever and funny quip about America no longer being a democracy, because of Trump.
2. And the government has refused to confirm or deny the existence of the order. Democracy!
3. In some of these apps, encryption is (for some reason) not enabled by default, but the option is there.
4. If you want a tortured analogy, it reminds me a little of the Brexit debate over the Irish border. Despite the years of bluster and bullshit, fundamentally it had to be drawn either on the island of Ireland or in the Irish Sea. And though the Windsor Framework “solved” the issue by sounding like a compromise, all it really did was draw the border in the Irish Sea and hand politicians the theoretical, but practically unusable, power to veto new European rules. If that brake were ever pulled, it would create a diplomatic and political shitstorm, and could ultimately result in severe retaliatory action from the EU. So in reality, the British government is never going to allow the Northern Irish assembly to use it.
5. Similarly, even if you trust our government to access data responsibly, such a breaking of encryption would hand every nightmare dictatorship a permission structure to do the same thing.
6. Apple has also driven improvements in the security of the rest of the tech industry. The reason Android has similar encryption and privacy features, even though Google has a stronger incentive to keep user data cracked open so that it can better target advertising, is that it needs to keep pace with Apple.
7. In the end, the FBI claims to have obtained access via a third-party exploit – in other words, a hacking tool that took advantage of a bug in Apple’s software, one that Apple eventually patched once it learned about it. So in this narrow case the FBI appears to have got what it wanted, but crucially Apple didn’t give up its principles, and the dispute only ended because of what was basically an accidental coding screw-up in the iPhone’s operating system.