Apple's "child safety" features, and the good old controversy.
According to Apple, the main issue is protecting children. Their new features are designed to prevent the spread of what is termed "CSAM" (or "child sexual abuse material," essentially a formal euphemism for child pornography) across their services. But according to the back-and-forth I've read from individuals on the Internet since the announcement, the main issue is user privacy. The controversy is about the nature and future implications of the collision between these two concerns. This topic deserves many more 45-minute videos, but my meagre aim today is to summarize and contribute observations I haven't seen mentioned.
A few facts for context, pertaining to the United States specifically. Child pornography is illegal. If online service providers come across it, they're required to report it to the National Center for Missing and Exploited Children, established by Congress in 1984. Other providers (Facebook and Google, for example) have complied with this requirement by automatically analyzing the content of the photos users store on their servers. The files reported so far number in the millions.
The number of reports from Apple to this point, however, totals several hundred. When it comes to user data, Craig Federighi has previously said on stage: "we just don't want to know," essentially rephrasing Apple's privacy-focused principle of data minimization. Apple hasn't been able to report offending photos and videos because their own services are designed not to allow them access.
Indeed, there's no government requirement for service providers to search for offending material, and in a new interview with Apple's head of privacy, Erik Neuenschwander, Matthew Panzarino asked why Apple has finally chosen to start doing so. Erik said they've been looking at it for some time – evidently they "don't want to know," but they know they don't want this category of content to exist on their servers – and this is the first time they've felt their approach can accomplish both.
In short: they've developed a hashing process which intelligently performs a one-way conversion of a photograph into a piece of otherwise meaningless data, such that similar photos – even ones which have been slightly altered – produce the same blob of meaningless data when similarly converted. The data can't tell you anything about the original photo, but two identical pieces of data imply two originals with the same content. Only the National Center for Missing and Exploited Children is allowed to have copies of the offending photos, from which a database of hashes is generated and compared against users' photos. Further stipulations: all of this comparison happens on users' devices, nothing is reported to Apple unless a certain number of matches is found, a human must verify any such report before Apple takes any action, and none of it happens in the first place for users who don't use iCloud Photos. This is the approach that's generated what's being called "controversy." Whatever may be controversial about it, I think it confirms that Apple "doesn't want to know." As Erik suggests with his answer to "why now," this entire approach would have seemed unrealistically futuristic a decade ago.
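For the sake of illustration, here's a minimal sketch of that matching logic as I understand it. Every name in it is mine, not Apple's, and none of the cryptography Apple actually layers on top (blinded hashes, safety vouchers, threshold secret sharing) is modeled – only the "nothing surfaces below the threshold" idea:

```swift
import Foundation

typealias PerceptualHash = String

// Placeholder only – not a real perceptual hash. A real one would be derived
// from image content, so that visually similar photos (even slightly altered
// ones) map to the same value.
func perceptualHash(ofPhotoAt url: URL) -> PerceptualHash {
    return url.lastPathComponent
}

struct MatchingPolicy {
    let knownHashes: Set<PerceptualHash>   // hashes derived from NCMEC's database
    let reportThreshold: Int               // below this count, nothing surfaces
}

// Runs entirely on the user's device: returns flagged items only if the match
// count meets the threshold; otherwise it returns nothing at all.
func evaluate(photos: [URL], against policy: MatchingPolicy) -> [URL]? {
    let matches = photos.filter { policy.knownHashes.contains(perceptualHash(ofPhotoAt: $0)) }
    guard matches.count >= policy.reportThreshold else { return nil }
    return matches   // at or above threshold: surfaced for human review
}
```

The point of the threshold and the human-review step, as described above, is that a stray match surfaces nothing at all.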
This "feature" is one of three. The second is Siri's new willingness to engage with users making search queries related to explicit material involving children. (I'm not sure it's clear which queries.) From what I've seen, this feature seems to have gone practically unnoticed by the most vocal privacy advocates – oddly enough, considering Siri is predicated on learning who you are for the purposes of assisting you generally.
The third – technologically separate from the other two – pertains to the iMessage service, specifically for child accounts subject to parental controls: the service's ability to detect incoming or outgoing sexually explicit photos using a machine learning approach. When it's on (at the parent's choice), such incoming photos are obscured, the user is warned about the potential content, informed that explicit photos can be used to "hurt" people, and told their parents will be notified if they choose to view the photo.
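Again purely as a sketch – Apple hasn't published an API for this, so every name below is hypothetical – the flow for a child account might look something like this:

```swift
// Hypothetical names throughout; this only mirrors the flow described above.

enum PhotoAction {
    case showNormally
    case obscureAndWarn(notifyParentsIfViewed: Bool)
}

struct ChildAccountSettings {
    let communicationSafetyEnabled: Bool   // turned on (or not) by the parent
    let parentalNotificationEnabled: Bool  // also the parent's choice
}

// `flaggedAsExplicit` stands in for the verdict of the on-device ML classifier.
func handleIncomingPhoto(flaggedAsExplicit: Bool,
                         settings: ChildAccountSettings) -> PhotoAction {
    guard settings.communicationSafetyEnabled, flaggedAsExplicit else {
        return .showNormally
    }
    // The photo is obscured and the child is warned before viewing;
    // parents are notified only if the child proceeds *and* notification is on.
    return .obscureAndWarn(notifyParentsIfViewed: settings.parentalNotificationEnabled)
}
```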
While privacy at large is the main contention, my first inclination is to consider how children are treated in arrangements like this. In a hypothetical situation where a suspicious stranger is trying to recruit an unwitting youth, the careful wording and balance of considerations seems appropriate to me. The child's account isn't automatically locked down, Apple doesn't learn anything at all, and ultimate control over the conversation remains with the child. The parents are notified only if they've chosen to be, the child is told whether and when that will happen, and the child can choose to back out of the situation beforehand (this is the nature of any "parental controls"-style protection). I hope users of all ages are responsible and Internet-safety-savvy enough to roll their eyes when such warnings appear, and that any parents using these controls will have talked them through with their family beforehand, but not all children benefit from such knowledge or good family relations, and this sort of net seems like it may succeed as a line of defense against trouble for some.
On the subject of privacy at large, I'm encouraged by the sheer amount of conversation, which I think indicates people are alert to the slightest potential privacy infringement – hearing crickets would be worrisome in itself. My only real gripe is the obscuring of facts and the use of ambiguity as a tactic, which I'm disappointed to have seen from some of the first sources that weighed in.
One – I think the one that first brought this whole issue to my attention – was Edward Snowden, whose personal sacrifice and service in favour of public awareness of government and corporate spying I deeply respect. However, his initial tweet claims of the iCloud Photos hashing approach: "if it finds a hit, they call the cops." Reporting users possessing known child pornography for prosecution is an action Apple states it intends to take, but as mentioned earlier, it requires much more than the system "finding a hit." He also says "iOS will also tell your parents if you view a nude in iMessage." Again, that presumes a list of things he didn't mention. A few days later, he continues to tweet with the hashtag "#spyPhone", saying that if we "don't shut up," "we can win this." (Where's his campaign about "winning" against Facebook over the past ten years? I may well have missed that one, but I haven't noticed any sense of relative perspective on this issue since he raised it.)
India McKinney and Erica Portnoy from the Electronic Frontier Foundation have also shared their opinion. I became aware of the EFF last winter, when their column on Facebook's campaign against Apple seemed in service of privacy and clarity. By contrast, this article claims right in its opening paragraph that "Apple is planning to build a backdoor" into its systems. The second paragraph seems, almost tellingly, to defend the use of the term: "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor." (On top of the unwarranted lack of a space in "back door," I'd love to read the authors' breakdown on how they decide whether to use a hyphen. At least it wasn't "back-door.")
So, what is a "back door" in this sense? From Oxford: "a feature or defect of a computer system that allows surreptitious unauthorized access to data."
Is this a feature? That's what Apple calls it. Is it part of a computer system? Yes. Is the access surreptitious? No, they're announcing it now, well in advance. Unauthorized? It's possible they won't add it to the terms of service, but I'd be staggered if they didn't. So, no, this isn't a back door. (You can find alternative definitions – some people prefer Wikipedia's, which includes "typically covert" – but any definition leaves you feeling that the EFF authors' lede is at least an exaggeration, this time undermining their credibility as people writing in the service of clarity.) Their final section about iMessage defends a similarly blurred-edged use of the term "end-to-end encryption," which – if you've actually read this far – I'll leave you to evaluate.
The EFF authors' conclusion is summarized in their final paragraph: "people have the right to communicate privately without backdoors or censorship, including when those people are minors. Apple should make the right decision: keep these backdoors off of users’ devices." I tend to agree with the point about the rights of minors, but again, minors' ability to use their devices freely is more directly affected by parental controls than by Apple's new features. If the EFF is going to take that strong a stance on those grounds, don't they have to take a much stronger one against the wide array of parental controls which have been available on iOS (and other systems) for years, limiting children's ability to view media or use apps and the Internet at parents' discretion? In any case, it seems these authors' idea of an acceptable balance between safety and privacy is that even a mechanism which reveals absolutely nothing to Apple in the absence of a predetermined amount of illegal content, on services users can freely stop using, does not sufficiently favour privacy.
Finally, an Apple internal memo obtained by 9to5Mac included a note from the National Center for Missing and Exploited Children's executive director of strategic partnerships, Marita Rodriguez. Among her congratulations to Apple employees on their work on these features was this remark: "we know that the days to come will be filled with the screeching voices of the minority. Our voices will be louder."
To be charitable, I'd say that's bizarre. To be more speculative, it's the kind of thing I'd expect from a jaded, battle-weary or insecure mind, one primed to classify any form of concern as dismissible rather than reasonable – and since the remark came with no further clarification about who this minority might be, even Apple's staff is all but invited to speculate. I think even "bizarre" is unseemly for a director of the organization specially privileged and entrusted in this arrangement.
All three of these examples of writing (Snowden's, the EFF authors', and Marita Rodriguez's) discourage me, especially because I believe they're all well-intentioned. Fuzzing the facts helps nobody, least of all the fuzzers. I expect that from dimmer minds, not brighter ones.
Fortunately, there are another three sources who seem to insist on clearing up misunderstandings rather than promoting them. The first is Rene Ritchie, who I opened with, and who addresses a whole branch of other reasonable, related questions I won't get into here.
Second is John Gruber, whose original column was one of the earliest on the issue, and remains one of the clearest. One guess of Gruber's is that this move is in fact a step toward full end-to-end encryption of everything in iCloud, ultimately the opposite of a privacy-compromising net shift.
Third is Apple themselves, who – to their credit, I believe – have only provided more detail as the days have passed, having recognized and responded to perceived concerns. The original announcement included a short stack of white papers on the technology itself, and more recently, they've released an FAQ document addressing what I think are the commonest questions and points of misunderstanding.
I said I'd aim to conclude with observations I hadn't seen mentioned, specifically about the future, as some critics are worried more about the implications than the announcement itself. ("If it's sexually explicit material involving children today, why shouldn't it be thought crime tomorrow?", and so on.) I agree it's a question. But the question that matters isn't what Apple could hypothetically do in the minds of the saddest and wisest (or otherwise), but what they will do. So, what indications do we have about what Apple will do?
It's been almost a decade since Steve Jobs died, and his contemporaries know his stance on privacy was a stark and simple one. I don't have the direct quote, but – almost by coincidence, since he wasn't even on the subject of child safety – I remember him saying on stage with Walt Mossberg something like: "we worry about this stuff. We worry that a 14-year-old somewhere is going to get in trouble because of our phone."
Would Steve have approved of this announcement, or something like it? I think so. One of Steve's trademark introductions to a feature was to highlight the traditional belief that two conflicting issues could never be reconciled, only to reveal they finally thought they'd figured out how they could be. And when "Apple" had made a decision based on Steve's personal conviction about what was right, you could see it on his face.
I miss the older Apple keynotes – not just the pre-pandemic live broadcasts from the stage, but the more casual and off-the-cuff presentations afforded by having fewer things to announce in the available time. They allowed people like Steve to explain their thinking first and their resulting project second, conveying the humanity behind the intentions and defusing suspicions about hidden motives. It feels like one of those monologues is what's missing this time.
I was encouraged, as I mentioned, that people seem alert even to the slightest potential privacy infringement. I'm not sure the most sensitive have considered, ironically, that this public sensibility was cultivated largely by Apple. It doesn't seem a stretch to imagine that without Apple's contribution to this particular zeitgeist, we might today live in a world where we expect every company whose software we use to possess a full profile of our personal information and to quietly exchange aggregate data about our communications, movements and interests – a world in which we shrug with learned apathy when asked whether that state of affairs bothers or disturbs us. Feature by feature, it feels like Apple has succeeded repeatedly in jostling other companies from a stupor, as they eventually follow suit, or at least feel pressured to take the infantile step of pretending they similarly care. With full encryption in Messages and FaceTime, differential privacy, Intelligent Tracking Prevention in Safari, the upcoming Mail Privacy Protection and iCloud Private Relay – not even to mention their firm, near-absolutist stance against adding an actual back door when pressured by the FBI – the list goes on, illustrating the vehemence with which they've shown they believe responsibility comes alongside their power.
How easily could a company with this record slide down the slippery slope some so reflexively imagine? If Steve Jobs' spirit will forever be the foundation of Apple, as Tim Cook has occasionally said, then we needn't worry about the future. But as Jobs also said on that particular stage, "the future is long," and any organization's course is in the hands of its current stewards. Your own mind, meanwhile, is stewarded by you.