
The only slippery slope with an exit


Early last month, Apple announced they would start scanning pictures bound for their iCloud Photo Library for known child sexual abuse material (CSAM). Apple is not the first service provider to do so. They are, however, the first to scan pictures on their users’ devices, and only if and before those pictures are to be stored on their servers.

Since the announcement, much has been said and written about Apple’s plans. In response, on September 3rd, Apple announced they would postpone the introduction of their CSAM-detection to give themselves more time to process the feedback they received. People, it seems, are uncomfortable knowing that their devices will search their photos for known illicit material. Yet they should feel more uncomfortable with the alternative and its implications for future privacy.

The past

It is 1996. Summer is almost over. You have returned from your vacation and handed in your 35 mm film for development by your local photographer. A couple of days later you return to retrieve the pictures. “They’ve turned out great,” she says as she hands you the envelope. You feel uneasy about this remark, but then realise she could not have developed the photos without looking at them. You take the developed pictures and the negatives back home.

The next day another person arrives at the photographer’s store. This time, she notices, to her horror, that the film contains CSAM. The photographer duly notifies law enforcement, thereby starting a process to help the victims and end the abuse. The film’s owner is arrested. Later hearings reveal that the person had handed in the wrong roll of film by accident.

None of us feel any discomfort knowing this protocol is in place. Victims of child abuse suffer for the remainder of their lives under the weight of their traumatic experiences. This procedure goes a little way towards curtailing the extent of that suffering. We expect people and institutions who are in a position to pick up signals of any form of child abuse to take appropriate action.

A dystopian past

It is 1996. Summer is almost over. You have returned from your vacation and handed in your 35 mm film for development at your local supermarket. A couple of days later you return to retrieve the pictures. “They’ve turned out great,” says the woman behind the counter as she hands you the envelope. “Oh,” she adds, “we took the liberty of developing your pictures twice. We will keep the second copy in our safe, should you ever lose yours.” “Thanks,” you reply, “that might come in handy!” and you step out the door.

Later you learn that supermarkets scan all archived pictures for illicit content every week. A slight feeling of unease takes hold of you. Then you realise you have nothing to hide, and you breathe a sigh of relief. As you lie in bed that night, you start wondering. “What if I ever upset one of the employees? They have access to the safe. Won’t they be able to frame me?” Soon that thought is replaced by another: “I trust my government to decide what is illegal now, but what about the future?”

A year later the climate in your country takes a turn for the worse. You return to your supermarket and ask them to delete their copies. “Sure,” the man behind the counter says. “Just press this button here, and then the one over there to confirm.” You press the first button, then the next. As you let go, the man tells you: “Your photos have been destroyed. Thank you for your visit.” “Wait, is that all?” you ask. “I would like to see some proof!” The man replies: “I’m sorry, sir, all I can give you is our word. You will have to trust us.”

Twenty-four years later you are arrested. One of your pictures from 1996 showed you attending a convention of a now-illegal political party.

The present

Last year, Facebook reported over twenty million instances of CSAM, according to the New York Times. Many of those cases may be very similar to that of our hypothetical perpetrator in 1996: people who upload CSAM either through ignorance or by accident.

Facebook is an example of a company that scans all content for illicit material. They have good reason to. The vast majority of their users would be disgusted, upset, traumatised if they discovered videos of beheadings or, indeed, CSAM in their feeds. Facebook, their users and the public share an interest in having all content scanned.

But… Facebook scans content on their servers, after it has left the device. This is akin to the supermarket from our dystopian past, where all pictures were archived and routinely checked. Dystopias like that one are at the heart of the “slippery slope” argument. Once the technology is developed, there is a risk of feature creep: the expansion of its functionality beyond what was considered acceptable at the time of its inception.

Many storage providers are incentivised to implement checks similar to Facebook’s. If they distribute content on their users’ behalf, they face the same risks.

How Apple’s CSAM-detection works

Apple’s proposed approach is different. Pictures are scanned only if and before they leave your device for your iCloud Photo Library. The scanning algorithm will be part of the operating system. If you decide to use Apple’s online service for photo storage, the algorithm will be activated and will scan outgoing pictures. The algorithm is designed to detect known CSAM, that is, instances of CSAM that have been confirmed as such by a duly authorised entity.

Before upload, every photo is scanned by the CSAM-detection algorithm. The algorithm generates an encrypted safety voucher that contains information about whether the photo matched known CSAM. But, as we will explain below, it is impossible to learn from a single voucher whether a match was found. Your device then attaches the voucher to the picture and sends the result to Apple.
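To make the flow concrete, here is a minimal Python sketch of the device-side steps. It is not Apple’s implementation: the real system uses a perceptual hash (NeuralHash) rather than a cryptographic one, and the on-device database is blinded so that not even the device learns whether a photo matched. All names and types below are illustrative.

```python
import hashlib
import os
from dataclasses import dataclass

# Illustrative stand-in for the database shipped with the operating system:
# hashes derived from material confirmed as CSAM by an authorised entity.
# (The real database holds blinded perceptual hashes, not SHA-256 digests.)
KNOWN_CSAM_HASHES: set[bytes] = set()


@dataclass
class SafetyVoucher:
    """Opaque blob attached to every uploaded photo."""
    payload: bytes  # encrypted; a single voucher reveals nothing on its own


def photo_hash(photo: bytes) -> bytes:
    # Stand-in for a perceptual hash such as NeuralHash.
    return hashlib.sha256(photo).digest()


def encrypt_match_result(matched: bool) -> bytes:
    # Placeholder: the real voucher hides the match result behind layered,
    # threshold-based encryption (illustrated in the next sketch).
    return os.urandom(32)


def make_voucher(photo: bytes) -> SafetyVoucher:
    matched = photo_hash(photo) in KNOWN_CSAM_HASHES
    # Every photo gets a voucher, match or no match, so the mere presence
    # of a voucher says nothing about the photo it accompanies.
    return SafetyVoucher(encrypt_match_result(matched))


def send_to_icloud(photo: bytes, voucher: SafetyVoucher) -> None:
    # Stub standing in for the actual upload to iCloud Photo Library.
    pass


def upload(photo: bytes) -> None:
    send_to_icloud(photo, make_voucher(photo))
```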

Safety vouchers are generated for all pictures. Thus, the presence of a safety voucher reveals nothing about the nature of a picture. Every picture has one!

On its own, the content of a single safety voucher is meaningless. It is mathematically impossible to infer from that voucher whether it was attached to known CSAM. Only if a user accumulates enough safety vouchers indicative of CSAM does the information contained in those vouchers become available to Apple.
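The threshold property can be illustrated with a classic k-of-n secret-sharing scheme. The toy Shamir-style sketch below is not Apple’s actual construction (which layers threshold secret sharing under private set intersection); it only shows the core idea: shares below the threshold reveal essentially nothing, while reaching the threshold unlocks the secret (in Apple’s case, the key needed to read the contents of the matching vouchers).

```python
import random

PRIME = 2**127 - 1  # a prime large enough for a demo secret


def split_secret(secret: int, threshold: int, num_shares: int):
    """Split `secret` into shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [
        (x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
        for x in range(1, num_shares + 1)
    ]


def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


# The secret stands in for the key that would let the server decrypt the
# contents of the safety vouchers; each matching photo contributes a share.
secret = 123456789
shares = split_secret(secret, threshold=3, num_shares=5)

assert reconstruct(shares[:3]) == secret   # at the threshold: recovered
assert reconstruct(shares[:2]) != secret   # below it: (almost surely) nothing
```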

The algorithm running on a device uses information about known CSAM, generated from material confirmed to be CSAM by a duly authorised entity. That information is not itself CSAM. It cannot be used to reconstruct CSAM. It cannot be used to detect innocent nude pictures of your own children. And if Apple wants to change the algorithm or the CSAM it can detect, they will need to release updates to their operating systems.

Implications for the future

Even if a cloud storage provider does not yet scan their customers’ data for CSAM, future legislation may require them to do so. The European Commission is investigating legislation that would require online service providers to detect and report CSAM. If such scanning is done on servers owned by the cloud provider, the provider must be able to access the content of the pictures.

Companies claim to encrypt your pictures. That is not a lie. They do. Typically, pictures are encrypted “during transit” and “at rest”. Encryption during transit ensures that only the intended recipient, i.e., the cloud provider, is able to access the content. Before a picture leaves your phone, the data is encrypted with a key that was agreed upon by both parties. The data arrives at its destination and is decrypted. Cloud providers may then encrypt your data again, this time using a key known only to a select number of people. This allows them to enforce policies that limit access to customer data to authorised employees. This is known as encryption “at rest”. Pictures that are not in use remain encrypted. If a picture is processed for any reason, it is decrypted. This system is used by all popular cloud storage providers, including Apple.
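As a rough illustration of those two layers, here is a sketch in Python using the `cryptography` package’s Fernet recipe as a stand-in for both the transport encryption and the provider’s at-rest key. The key names and flow are illustrative rather than any provider’s actual implementation; the point is that with encryption at rest, the provider holds a key with which it can decrypt your data whenever it needs to.

```python
from cryptography.fernet import Fernet

# --- During transit --------------------------------------------------------
# Client and provider agree on a session key (in reality via TLS); only the
# intended recipient, i.e. the provider, can read what travels over the wire.
transit_key = Fernet.generate_key()
photo = b"raw bytes of a holiday picture"
in_transit = Fernet(transit_key).encrypt(photo)

# --- At rest ---------------------------------------------------------------
# The provider decrypts the upload, then re-encrypts it with a key that the
# provider, not the customer, controls.
received = Fernet(transit_key).decrypt(in_transit)
provider_key = Fernet.generate_key()  # known only to authorised employees
at_rest = Fernet(provider_key).encrypt(received)

# Whenever the provider needs to process the picture (thumbnails, indexing,
# a CSAM scan, a legal request), it simply decrypts it again.
assert Fernet(provider_key).decrypt(at_rest) == photo
```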

Through “encryption at rest” they retain the capability to access your data. Through this capability they can add checks that comply with the law. Apple’s proposed scanning mechanism opens the door to a more privacy-preserving technique: “end-to-end” encryption. In this scheme, pictures are encrypted with a key that is unique and private to the user. No one else, not even the cloud provider, has access to this key. Once implemented, the cloud provider loses the capability to decrypt, and hence process, their customers’ data. As a result, they cannot scan content for CSAM on their servers.

Apple’s proposed mechanism detects CSAM before the data is encrypted and sent to their servers. Apple can thus guarantee to the authorities that, when the data was uploaded, it was free of known CSAM (up to the threshold described above). If a customer uses iCloud Photo Library to store CSAM, Apple will be notified and can report them to law enforcement. They can do all this without ever seeing a single bit of intelligible data.
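Put together, the order of operations is what matters: the on-device scan runs while the photo is still plaintext, and only afterwards is the photo encrypted with a key that never leaves the user. The sketch below is again conceptual, with a stub in place of the voucher generation shown earlier, not Apple’s actual protocol.

```python
from cryptography.fernet import Fernet

# A key generated on the device and never shared with anyone, the provider
# included; this is what "end-to-end" means here.
user_key = Fernet.generate_key()


def make_safety_voucher(photo: bytes) -> bytes:
    # Stand-in for the on-device CSAM check sketched earlier; it runs while
    # the photo is still readable plaintext.
    return b"opaque-voucher"


def upload_end_to_end(photo: bytes) -> tuple[bytes, bytes]:
    voucher = make_safety_voucher(photo)          # 1. scan before encryption
    ciphertext = Fernet(user_key).encrypt(photo)  # 2. encrypt with the user's key
    # 3. The provider stores the ciphertext and counts vouchers against the
    #    threshold, but it can never decrypt the photo itself.
    return ciphertext, voucher
```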

This mechanism combines the convenience of digital photos with the sense of ownership and locality of their physical equivalents. If we stick a photo in a physical album and put the album on a shelf, that album and its photos remain in our home, accessible to nobody but us and those we grant physical access to it. The physical, local nature of the album makes it hard for third parties to study its contents. Digital data, on the other hand, is trivially copied, manipulated and analysed.

It is this scanning mechanism that will allow Apple to close the gap between physical and digital goods, while complying with the law. Without end-to-end encryption in place — and to be clear, it is currently not — Apple can technically process all photos and documents stored on their servers for any purpose. Now and in the indefinite future. Once Apple enables end-to-end encryption, they will lose this capability.

A possible future

Suppose you trusted Apple to handle your data in a manner that aligns with your values, as you may have trusted your supermarket in the dystopian past. Unlike that supermarket, however, Apple keeps its copy in a form that can be deciphered by you, and you only.

If the day comes that you lose trust in Apple and your values no longer align with Apple’s practices, you would not need to concern yourself with the copies that are on their servers. Those pictures are encrypted. Only you own the decryption keys. Apple cannot access, see, or infer anything about your pictures’ content. Once they have left your device, Apple cannot scan your pictures any more.

You could then stop using iCloud. You could stop updating your OS. You could block Apple’s devices from accessing their services at the network level. From the moment you take any of those measures, you are no longer affected. You may not want to use Apple’s devices or services going forward, but at least your past will not haunt you.

If, on the other hand, you were using another cloud provider, one that relied on server-side scanning, there would be no escape. Your data is on their servers, and they hold the keys to it. You may delete your data and decide to stop using the service, but there is no guarantee the data is not physically retained and scanned.

In the future, Apple, and any other cloud provider, may have no option but to scan your photos. Each cloud provider will have their own system in place, and all are subject to feature creep and “slippery slope” arguments. One slope has exit lanes.