Two weeks ago, the Washington Post reported that the U.K. government had issued a secret order to Apple, demanding that the company add a “backdoor” to its end-to-end encrypted iCloud Backup feature. From the article:
The British government’s undisclosed order, issued last month, requires blanket capability to view fully encrypted material, not merely assistance in cracking a specific account, and has no known precedent in major democracies. Its application would mark a significant defeat for tech companies in their decades-long battle to avoid being wielded as government tools against their users, the people said, speaking under the condition of anonymity to discuss legally and politically sensitive issues.
That same report predicted that Apple would soon be disabling their end-to-end encrypted iCloud backup feature (called Advanced Data Protection) for all U.K. users. On Friday, this prediction was confirmed:
Apple’s decision to disable their encrypted cloud backup feature has triggered many reactions, including a few angry takes by Apple critics, accusing Apple of selling out its users:
With all this in mind, I think it’s time to take a sober look at what might really be happening here. This will require some speculation and educated guesswork. But I think that exercise will be a lot more helpful to us in figuring out what’s really going on.
Question 1: does Apple really care about encryption?
Encryption is a tool that protects user data by processing it with a key, so that only the holder of the appropriate key can read it. A variant called end-to-end encryption (E2EE) uses keys known only to the user (or users). The benefit of this approach is that the data is protected from many threats facing centralized repositories: theft, cyber attacks, and even access by sophisticated state-sponsored attackers. The downside is that the same encryption can also block governments and law enforcement agencies from accessing the data.
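To make the distinction concrete, here is a deliberately simplified sketch of the E2EE idea in Python, using the third-party `cryptography` package purely for illustration (this shows the concept, not Apple’s actual implementation):

```python
# Conceptual sketch of end-to-end encryption: the key is generated and kept
# on the user's device, so the provider only ever stores ciphertext.
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()          # lives only on the user's device
ciphertext = Fernet(device_key).encrypt(b"my private backup data")

# The provider (or anyone who compromises or compels it) sees only
# `ciphertext`; without device_key the plaintext cannot be recovered.
assert Fernet(device_key).decrypt(ciphertext) == b"my private backup data"
```

In a real E2EE backup system the key management is far more involved, but the essential property is the same: whoever does not hold the key cannot read the data.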
Navigating this tradeoff has been a thorny problem for Apple. Nevertheless, the company has mostly opted to err on the side of aggressive deployment of (end-to-end) encryption. A few examples:
- In 2009, the company began encrypting all iPhone internal data storage by default. This is why you can feel safe (about your data) if you ever leave your iPhone in a cab.
- In 2011, the company launched iMessage, a built-in messaging service with default end-to-end encryption for all users. This was the first widely-deployed end-to-end encrypted messaging service. Today these systems are recommended even by the FBI.
- In 2013, Apple launched iCloud Key Vault, which encrypts your backed-up passwords and browser history using encryption that even Apple can’t access.
Apple faced law enforcement backlash on each of these moves. But perhaps the most famous example of Apple’s aggressive stance on encryption occurred during the 2016 Apple v. FBI case, where the company actively fought the U.S. government’s demands to bypass encryption mechanisms on an iPhone belonging to an alleged terrorist. Apple argued that satisfying the government’s demand would have required Apple to weaken encryption on all of the company’s phones. Tim Cook even took the unusual step of signing a public letter defending the company’s use of encryption:
I wouldn’t be telling you the truth if I failed to mention that Apple has also made some big mistakes. In 2021, the company announced a plan to implement client-side scanning of iCloud Photos to search for evidence of illicit material in private photo libraries. This would have opened the door for many different types of government-enforced data scanning, scanning that would work even if data was backed up in an end-to-end encrypted form. In that instance, technical experts quickly found flaws in Apple’s proposal and it was first paused, then completely abandoned in 2022.
This is not intended to be a hagiography for Apple. I’m simply pointing out that the company has, in the past, taken major public risks to deploy and promote encryption. Based on this history, I’m going to give Apple the benefit of the doubt and assume that the company is not racing to sell out its users.
Question 2: what was the U.K. really asking for?
Way back in 2016, the U.K. passed the Investigatory Powers Act, sometimes called the “Snooper’s Charter.” At the time the law was enacted, many critics argued that it could be used to secretly weaken security systems, potentially making them much more vulnerable to hacking.
This was due to a critical feature of the new law: it enables the U.K. government to issue secret “Technical Capability Notices” that can force a provider, such as Apple, to secretly change the operation of their system — for example, altering an end-to-end encrypted system so that Apple would be forced to hold a copy of the user’s key. With this modification in place, the U.K. government could then demand access to any user’s data on demand.
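To make the stakes concrete, here is a hypothetical, heavily simplified sketch of what “holding a copy of the user’s key” could look like, again in Python with the `cryptography` package; the scheme and variable names below are my own illustration, not anything Apple has built or the U.K. has been reported to specify:

```python
# Hypothetical key-escrow sketch: the user's key is additionally uploaded,
# "wrapped" (encrypted) under a key held by the provider or authority.
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()        # normally known only to the user
escrow_key = Fernet.generate_key()      # held by the provider / authority

backup_ciphertext = Fernet(user_key).encrypt(b"user backup data")

# The escrow step that turns an E2EE system into a backdoored one:
wrapped_user_key = Fernet(escrow_key).encrypt(user_key)

# Whoever holds escrow_key can unwrap the user's key on demand and then
# decrypt everything that key protects.
recovered_key = Fernet(escrow_key).decrypt(wrapped_user_key)
assert Fernet(recovered_key).decrypt(backup_ciphertext) == b"user backup data"
```

The point of the sketch is simply that once such an escrow key exists anywhere, the “end-to-end” property is gone for every user whose key is wrapped under it.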
By far the most concerning part of the U.K. law is that it does not clearly distinguish between U.K. customers and non-U.K. customers, such as those of us in the U.S., or users in European nations. Apple’s lawyers called this out in a 2024 filing to Parliament:

In the worst-case interpretation of the law, the U.K. might now be the arbiter of all cybersecurity defense measures globally. Her Majesty’s Government could effectively “cap” the amount of digital security that customers anywhere in the world can depend on, without users even knowing that cap was in place. This could expose vast amounts of data to state-sponsored attackers, such as the ones who recently compromised the entire U.S. telecom industry. Worse, because the U.K.’s Technical Capability Notices are secret, companies like Apple would be effectively forced to lie to their customers — convincing them that their devices are secure, when in fact they are not.
It goes without saying that this is a very dangerous road to start down.
Question 3: how might Apple respond to a broad global demand from the U.K.?
Let us imagine, hypothetically, that this worst-case demand is exactly what Apple is faced with. The U.K. government asks Apple to secretly modify their system for all users globally, so that it is no longer end-to-end encrypted anywhere in the world.
(And if you think about it practically: that flavor of demand seems almost unavoidable in practice. Even if you imagine that Apple is being asked to target only users in the U.K., the company would either need to build this capability globally, or it would need to deploy a new version or “zone”1 for U.K. users that would work differently from the version for, say, U.S. users. From a technical perspective, this would be tantamount to admitting that the U.K.’s version is somehow operationally distinct from the U.S. version. That would invite reverse-engineers to ask very pointed questions and the secret would almost certainly be out.)
But if you’re Apple, you absolutely cannot entertain, or even engage with, this possibility. The minute you engage with it, you’re dead. One single nation — the U.K. — becomes the governor of all of your security products, and will now dictate how they work globally. Worse, engaging with this demand would open a hell-mouth of unfortunate possibilities. Do you tell China and Europe and the U.S. that you’ve given the U.K. a backdoor into their data? What if they object? What if they want one too?
There is nothing down that road but catastrophe.
So if you’re Apple and faced with this demand from the U.K., engaging with the demand is not really an option. You have a relatively small number of choices available to you. In order of increasing destructiveness:
- Hire a bunch of very expensive lawyers and hope you can convince the U.K. to back down.
- Shut down iCloud end-to-end encryption in the U.K. and hope that this renders the issue moot.
- ???
- Exit the U.K. market entirely.
If we can believe the reporting so far, I think it’s safe to say that Apple has almost certainly tried the legal route. I can’t even imagine what the secret court process in the U.K. looks like (does it involve wigs?) but if it’s anything like the U.S.’s FISA courts, I would tend to assume that it is unlikely to be a fair fight for a target company, particularly a foreign one.
In this model, Apple’s decision to disable end-to-end encrypted iCloud Backup means we have now reached Stage 2. U.K. users will no longer be able to sign up for Apple’s end-to-end encrypted backup as of February 21. (We aren’t told how existing users will be handled, but I imagine they’ll be forced to voluntarily downgrade to unencrypted service, or else lose their data.) Any request for a backdoor for U.K. users is now completely moot, because effectively the system no longer exists for U.K. users.
At this point I suppose it remains to be seen what happens next. Perhaps the U.K. government blinks, and relaxes its demands for access to Apple’s keys. In that case, I suppose this story will sink beneath the waves, and we’ll never hear anything about it ever again, at least until next time.
In another world, the U.K. government keeps pushing. If that happens, I imagine we’ll be hearing quite a bit more about this in the future.
Top photo due to Rian (Ree) Saunders.
Notes:
- Apple already deploys a separate “zone” for many of its iCloud security products in China. This is due to Chinese laws that mandate domestic hosting of Apple server hardware and keys. We have been assured by Apple (in various reporting) that Apple does not violate its end-to-end encryption for the Chinese government. The various people I’d expect to quit — if that claim was not true — all seem to be still working there.



The “big mistake” in question 1 regarding CSAM was a failure of marketing. When you upload photos to any digital service in the US, those photos will be scanned for CSAM. Period.
Apple do this, Google do this, Yahoo do this, hell Imgur does this.
Apple can’t provide seamless end-to-end encrypted filesystems in iCloud because those images need to be uploaded to the server and scanned by Apple on upload. To get around that, they designed a system where the scanning could be done on the device, and images were only sent “in the clear” for some poor sod to actually look at and check whether the device had correctly flagged them as “yes, this matches the cryptographic hash of a known CSAM image”.
That’s it.
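For readers unfamiliar with the mechanism being described, here is a deliberately oversimplified sketch of hash-list matching in Python; real deployments, including Apple’s 2021 proposal, used perceptual hashes and more elaborate cryptographic protocols, and the hash value below is a placeholder:

```python
# Oversimplified illustration of matching files against a list of known
# hashes. Real systems use perceptual hashes, not exact SHA-256 digests.
import hashlib

KNOWN_HASHES = {
    "placeholder0000000000000000000000000000000000000000000000000000",
}

def is_flagged(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears on the known-hash list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

# Only flagged files would be escalated for human review.
print(is_flagged(b"some photo bytes"))   # False unless the digest is listed
```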
So we have:
There seems to be a group of people who are pro-scanning and have decided that the best way to promote this view is to misinform people about U.S. law. I’m assuming you picked this up in good faith, so let me see if I can help clear this up.
There is no requirement whatsoever to scan uploaded content for CSAM. If you think about it, legally forcing private providers to scan their customers’ data would be an obvious violation of the 4th amendment — you’re effectively deputizing private firms as government agents to search private papers. Any CSAM scanning that gets done is opt-in by providers; they are only legally obligated to *report* CSAM that they find, not to search for it.
Some providers opt to scan content for *distribution* (for example, posts to a public Facebook page) which makes tons of sense and can be legally required. A vastly smaller number have opted to scan private messages. Some others have split the difference: some cloud providers scan when you “share” a file with another user, but otherwise leave unshared files alone.
Clearly, backed-up photo repositories that are private to a single user aren’t being distributed or shared, so there’s no requirement to scan them. And once they’re end-to-end encrypted, this is doubly true. That’s why Apple does not scan iMessage attachments, Signal does not scan photo attachments, and today’s Advanced Data Protection does not scan iCloud Photos.
I can see how if you start with the false premise that scanning every uploaded file is required, the rest of your logic might make sense. Here in the US that is absolutely not the case.
At least it should still be possible to encrypt photos yourself before uploading the files to cloud storage, to prevent any snooping for CSAM (or any other kind of scanning of your uploaded files for particular content).
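A minimal sketch of that approach, using the Python `cryptography` package purely as an illustration (the file names and data below are placeholders):

```python
# Encrypt a photo locally before handing it to any cloud sync client,
# so the provider only ever stores ciphertext it cannot scan or read.
from pathlib import Path
from cryptography.fernet import Fernet

key = Fernet.generate_key()                  # keep this key offline
photo_bytes = b"...raw JPEG bytes..."        # in practice: Path("photo.jpg").read_bytes()

Path("photo.jpg.enc").write_bytes(Fernet(key).encrypt(photo_bytes))

# Upload photo.jpg.enc instead of the original; only the key holder can
# later recover the photo:
assert Fernet(key).decrypt(Path("photo.jpg.enc").read_bytes()) == photo_bytes
```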
Having such scans performed automatically on the user’s device itself would mean integrating these capabilities into the operating system, and there would be no guarantee that the mere saving of a file isn’t intercepted by the OS to scan its contents before actually writing the data to the intended storage location as it normally would, all while keeping a shadow list of hashes produced by whatever scanning mechanisms have been implemented. I doubt it would be possible to prevent such an OS from forwarding those hashes to whatever party is responsible for distributing the lists of hashes to be scanned for on systems worldwide. Systems like macOS, Windows, or Android will conveniently fail to communicate the full scope of (ex)changes included in the next mandatory security patch, most likely rolled out under the guise of addressing some critical zero-day exploit, and presto… instead of merely needing to be cautious and encrypt data before uploading it to online storage, it is now the workings of your own system that have become inherently untrustworthy.
Not to mention how quickly it would devolve into a bulky, bloated piece of shite that you would be fighting constantly as more and more bugs and exceptions to the rules are introduced, guzzling memory and processing cycles while programmed to keep its crucial processes hidden or impossible to monitor properly by anyone using the system. I remember some Adobe software got to the point where there were more processes running to monitor the user than processes related to simply having the application perform the functions it was intended to provide. Suffice it to say that it posed quite a challenge when your system ground to a halt and you were basically left at the mercy of their customer support to assist you in disabling what they considered to be anti-piracy measures. Their efforts to prevent unauthorized users from running their software were a prime example of how security through obscurity can quickly become a huge liability when the people doing the actual implementation of such code have shit for brains.
Only these days, systems generally have enough processing power to keep an eye on user activity with badly coded spyware for a while before performance suffers enough for the user to notice, so you can expect crucial parts of such surveillance software to be developed by one or two interns using some questionable AI engine for code completion. By the time the mess makes it impossible to pinpoint the van, another intern or their kid will have no trouble ascertaining that the project’s documentation is severely lacking, and that fixing some of the flaws will require rethinking and rebuilding from scratch.
How does this impact other service providers like Google who have very similar offerings to Apple?
Apple is choosing the smart way out. Let the UK user outrage do what their lawyers apparently could not. The major downside, of course, is that other nations could now demand that Apple similarly disable E2EE for users in their countries as well. Enter on-device applications to encrypt and hold the data at the software level so that the hardware level is moot. The cat and mouse game continues….
His Majesty’s Government…
Besides turning off encryption for backups there is the option of doing no backups.
Was this meant to be ironic, or have we really gotten to the point where we can no longer distinguish between coercion and free will?
More pushback from citizens on the UK government is needed.
As this article correctly states, Apple’s options are limited. But I do not agree that leaving the UK market would be the most harmful choice. It would be the best thing to do, because it would put UK citizens in the same boat as Apple and increase the pressure against evil laws like the ones the UK is producing. I’d say the UK government would have a hard time telling its voters: for the sake of our paranoid sniffing, we are taking the Apple ecosystem away from you. And yes, I’m aware that many people own Apple shares and would rather look away and obey for the sake of their investment. And yes, I’m aware that Apple’s board would face hard times from shareholders, because they prefer nothing more than seeing the share price grow.
This is not a technological issue. It is a societal issue.