In this living document, I will collect reactions to uses of homomorphic
encryption by members of the public. By “member of the public,” I mean people
who may be technical, but are not directly involved in the development or
deployment of homomorphic encryption systems. This includes journalists,
bloggers, participants in aggregator comment threads, and social media users.
My main goal is to understand how the public perceives the risks
and benefits of using homomorphic encryption.
As such, I will refrain as best I can from inserting my own commentary
on the risks and benefits of using homomorphic encryption,
or counterpoints to the opinions I cite.
For background on FHE, see my overview of the
field.
For examples of production FHE deployments,
see this living document.
If you come across any examples not in this list,
please send me an email
with a link.
Table of contents:
Response to Apple’s Enhanced Visual Search for photos
In October 2024 Apple released a new Photos feature called Enhanced Visual
Search,
enabled by default in iOS 18 and macOS 15. This feature uses homomorphic
encryption as part of a larger protocol that allows Apple to run a private
nearest-neighbor search on user photos, identifying landmarks and similar
points of interest against an Apple-internal corpus of photos.
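To make the data flow concrete, here is a minimal sketch of the core pattern: the client encrypts a photo embedding, the server computes similarity scores directly on the ciphertext, and only the client can decrypt the results. This is not Apple’s implementation (Apple’s open-source swift-homomorphic-encryption library implements the BFV scheme, and the deployed protocol reportedly layers differential privacy and anonymized routing on top). The sketch below uses the Python TenSEAL library with the CKKS scheme, and the embeddings and landmark names are invented for illustration.

```python
import tenseal as ts

# Client-side setup: a CKKS context whose secret key never leaves the device.
# These parameters are illustrative defaults, not Apple's actual choices.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2**40
context.generate_galois_keys()

# The client encrypts an embedding of a photo region (a toy 4-dimensional
# vector here; real image embeddings are much larger).
query_embedding = [0.1, 0.9, -0.3, 0.5]
enc_query = ts.ckks_vector(context, query_embedding)

# Server-side: compute a similarity score (dot product) between the encrypted
# query and each plaintext corpus embedding. The server operates on
# ciphertexts only and never sees the query.
corpus = {
    "eiffel_tower": [0.1, 0.8, -0.2, 0.4],
    "golden_gate_bridge": [-0.5, 0.1, 0.9, 0.0],
}
enc_scores = {name: enc_query.dot(vec) for name, vec in corpus.items()}

# Client-side: only the holder of the secret key can decrypt the scores
# and pick the nearest neighbor.
scores = {name: enc.decrypt()[0] for name, enc in enc_scores.items()}
print(max(scores, key=scores.get))  # -> eiffel_tower
```

The property at the center of the debate below is visible even in this toy: what travels over the network is a ciphertext, and the server computes on data it cannot read.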
On 2024-12-28 Jeff Johnson wrote a blog post, Apple Photos phones home on
iOS 18 and macOS 15, in
which he lambasts Apple for violating privacy. Johnson is a software engineer
who seems to focus mainly on macOS apps. He is also the sole proprietor of
Underpass App Company, which designs apps that do various forms of “blocking”
of unwanted behaviors in web browsers. He is credited with discovering various
security vulnerabilities in macOS and iOS. He wrote a follow-up
article three days later
in which he doubles down on his complaints.
In his first article, he writes:
From my own perspective, computing privacy is simple: if something happens
entirely on my computer, then it’s private, whereas if my computer sends data
to the manufacturer of the computer, then it’s not private, or at least not
entirely private. Thus, the only way to guarantee computing privacy is to not
send data off the device.
Johnson avoids commenting on the merits of homomorphic encryption.
He claims he doesn’t have the expertise or desire to do so, and moves past it.
Instead, Johnson seems mainly concerned with the practical side of the
security problem: software has bugs, and the more code you write, the more
likely some kind of security vulnerability will appear. He writes:
You don’t even have to hypothesize lies, conspiracies, or malicious
intentions on the part of Apple to be suspicious of their privacy claims. A
software bug would be sufficient to make users vulnerable, and Apple can’t
guarantee that their software includes no bugs.
He later goes further to say that, even if the software had no bugs, he
would still oppose the feature being imposed on him. In his second article,
Johnson writes:
My argument was that Enhanced Visual Search offered me no benefits at all.
I’ve literally never wanted to search for landmarks in my Photos library.
Thus, in this specific case, no amount of privacy risk, no matter how small,
is worth it to me.
This “zero reward justifies zero risk” attitude is reasonable, but it doesn’t
help me understand how much risk people see in using homomorphic encryption. If
Johnson did want to identify landmarks in his photos, would this protocol be
sufficient protection? What feature would be useful enough to Johnson to
justify using homomorphic encryption?
In a third article, Johnson
underscores that technology, no matter how good, is not a replacement for
consent.
Johnson’s article scored 1320 points on Hacker News,
making it the number 1 post on 2024-12-29
(beating the news of Jimmy Carter’s death).
Most of the commentary was about opt-in versus opt-out software features,
and the gap between the complexity of what people agree to when they give
consent and their actual comprehension of it.
Many concurred with Johnson that user data should not be sent over the network
for any reason without explicit consent.
Some comments went far off the deep end, suggesting
this feature is effectively a back-alley method
of re-inserting CSAM scanning efforts into Apple’s software,
so that flagged photos could be sent to law enforcement.
Others pointed out many flaws in other commenters’ understanding,
with comments akin to “read the fucking article.”
One cryptographer, Matthew Green, chimed in to concur:
while he could understand the system and evaluate its risks,
it was enabled by default while he was on holiday,
so he wouldn’t have had the time to do so anyway.
This suggests that “enabled by default” was the main issue.
Many commenters seemed to claim this was “private information being sent to Apple,”
without understanding what was actually being sent.
So this raises a question that I would like to ask anyone I can about this topic:
if you send an encryption of sensitive data to a company,
and they don’t have the decryption key,
does that constitute “sharing sensitive data”
or is that “private”?
This blog post was directly or indirectly referenced by a variety of
Apple-specific online publications, including:

- Macworld, which tried to convince the reader the feature is secure
- 9to5Mac, whose take was essentially, “meh, seems fine”
- Tom’s Guide, whose headline suggests a risk, though the article contains no new information
On 2025-01-10, Evan Boehs published an article
which, in response to Johnson’s article, provides
a more detailed explanation of Apple’s protocol.
(Disclosure: Boehs cites my blog’s overview of FHE.)
In the end, Boehs decides:
I myself, having put in the time to piece together a huge pile of scattered
information, have decided I like the feature and will leave it enabled. With
that being said, technical understanding is no substitute for consent, which
should have been requested by Apple along with a proper explanation.
Boehs’s article was also discussed on
Hacker News, and the
comments continue with opt-in vs. opt-out concerns, but some commenters
demonstrate optimistic curiosity about FHE technology.
On 2025-03-26, Rehan Rishi and Haris Mughees,
two of the Apple developers on this project,
gave a talk at the Real World Crypto conference.
The video can be found on YouTube.
They did not address the concerns raised above,
though one slide touched on them indirectly:

[Image: a slide from Apple’s RWC2025 talk that states “Apple’s privacy commitment: Client queries remain private, even from Apple”]
Compare this to the billboard highlighted in Jeff Johnson’s article:

[Image: a billboard for the iPhone that states “What happens on your iPhone, stays on your iPhone.”]