abalone's comments | Hacker News

This has gone a bit under the radar, but Apple went and built a private search engine that doesn’t know what you’re searching for. It’s called Wally, and it powers features like caller ID lookup and landmark recognition in Photos.

* Homomorphic encryption is nothing short of dark magic that makes it impossible for the server to know what you are searching for or what it’s responding with (at considerably greater computational expense). There's a toy sketch of the idea right after this list.

* To make it scale they added differential privacy techniques that generate fake queries from clients to hide which shards are being queried. So rather than always querying every shard (private but expensive), clients precompute which shards are most likely to hold the nearest-neighbor match for their query. By itself that shard choice would leak something about your query, so they bury the real queries in a sea of fake ones across the fleet (second sketch below).

* They also slightly randomize and batch the queries into epochs so the server can’t even identify time-based traffic patterns.

* It’s all routed through an OHTTP-like private relay that anonymizes requests by hiding client IP addresses from the destination. It does this by having the client connect to a relay that’s sort of like a VPN, except the client passes it an encrypted destination address. The relay then hands this to a second relay run by another company, which holds the decryption key but doesn’t know your IP. So no one party can associate you with your destination. (This is actually the least groundbreaking part of the system.)
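
To give a feel for the homomorphic encryption piece, here is a toy private-lookup sketch. It uses textbook Paillier (an additively homomorphic scheme) with laughably small parameters instead of the lattice-based schemes used in real deployments, and the "database" is just four numbers -- purely to show how a server can answer a query it cannot read:

    import math, random

    # Toy Paillier keypair. These primes are NOT secure sizes; real systems use
    # far larger parameters (and different schemes entirely).
    p, q = 2**31 - 1, 2**61 - 1
    n, n2 = p * q, (p * q) ** 2
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)

    def enc(m):
        r = random.randrange(1, n)
        return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2    # g = n + 1

    def dec(c):
        return ((pow(c, lam, n2) - 1) // n) * mu % n

    db = [17, 42, 99, 7]          # the server's lookup table, e.g. caller-ID rows

    # Client: encrypt a one-hot selection vector for the row it wants.
    wanted = 2
    query = [enc(1 if j == wanted else 0) for j in range(len(db))]

    # Server: homomorphic dot product. It only ever sees ciphertexts, so it
    # learns neither the index requested nor the value it returns.
    reply = 1
    for c, value in zip(query, db):
        reply = (reply * pow(c, value, n2)) % n2           # Enc(sum_j e_j * db_j)

    # Client: decrypt locally.
    assert dec(reply) == db[wanted]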

Academic paper here with all the details: https://arxiv.org/pdf/2406.06761
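
And here is a toy simulation of the fake-query idea (the parameters are made up, not the ones in the paper): each client sends its one real query to the shard it actually needs plus a few dummies to random shards, and because the relay strips identities, the server just sees an anonymized stream in which any individual query is probably fake.

    import random
    from collections import Counter

    NUM_SHARDS, NUM_CLIENTS, FAKES_PER_CLIENT = 8, 10_000, 3   # made-up numbers

    real, observed = Counter(), Counter()
    for _ in range(NUM_CLIENTS):
        true_shard = random.randrange(NUM_SHARDS)   # shard the client's query maps to
        real[true_shard] += 1
        observed[true_shard] += 1                   # the one real query
        for _ in range(FAKES_PER_CLIENT):
            observed[random.randrange(NUM_SHARDS)] += 1   # indistinguishable dummies

    for s in range(NUM_SHARDS):
        print(f"shard {s}: real={real[s]:5d}  seen by server={observed[s]:5d}")

    # Any single query the server sees is fake with probability 3/4 here, and
    # the relay prevents linking queries back to a client, so "which shard did
    # this user hit" is buried in noise. Epoch batching with random delays
    # keeps timing from giving the game away.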


According to Apple,

"A randomly generated UID is fused into the SoC at manufacturing time. Starting with A9 SoCs, the UID is generated by the Secure Enclave TRNG during manufacturing and written to the fuses using a software process that runs entirely in the Secure Enclave. This process protects the UID from being visible outside the device during manufacturing and therefore isn’t available for access or storage by Apple or any of its suppliers."[1]

But yes, of course, you have to trust that the manufacturer is not lying to you. PCC (Private Cloud Compute) is about building on top of that fundamental trust to guard against a whole variety of other attacks.
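
For intuition on why the fused UID matters, here is a tiny conceptual sketch (my own simplification, not Apple's actual key hierarchy): device keys are derived from a secret that only ever exists inside the enclave, so material protected by them can only be recovered on that one device -- provided the fusing process really does keep the UID from leaking, which is exactly the trust-the-manufacturer part.

    import os, hmac
    from hashlib import sha256

    class SecureEnclaveSim:
        """Toy stand-in for a secure element with a fused, unexportable UID."""
        def __init__(self):
            self._uid = os.urandom(32)   # generated inside the enclave, never leaves

        def derive_key(self, purpose: bytes) -> bytes:
            # Keys are derived from the UID, so they exist only on this device.
            return hmac.new(self._uid, purpose, sha256).digest()

    enclave = SecureEnclaveSim()
    file_key = enclave.derive_key(b"data-protection/class-A")
    # Without the UID, nobody (including the manufacturer, if the fusing
    # process is honest) can recompute file_key or decrypt data wrapped with it.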

[1] https://support.apple.com/guide/security/secure-enclave-sec5...


> Harder to attack, sure, but no outside validation.

There is actually a third party auditor involved in certifying hardware integrity prior to deployment.[1]

But yes, the goal is to protect against rogue agents and hackers (and software bugs!), not to prove that Apple as an organization hasn't fundamentally designed backdoors into the secure element of its silicon.

[1] https://security.apple.com/documentation/private-cloud-compu...


> As soon as you start going down the rabbit hole of state sponsored supply chain alteration, you might as well just stop the conversation. There's literally NOTHING you can do to stop that specific attack vector.

Just want to point out that Apple has designed in a certain degree of protection against this attack, and they talk about it![1]

In a nutshell they do two things: supply chain hardening and target diffusion. Supply chain hardening involves multiple verification checkpoints. And target diffusion greatly limits the utility of a small-scale compromise of a few nodes, because users are not partitioned by node. Together these mean the entire system would have to be compromised from manufacturing to data center and across all or most nodes. Which is certainly possible! But it's a significant raising of the bar above your "run of the mill" state-sponsored shipment interdiction or data center compromise.
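
A toy simulation of the target-diffusion point (numbers are made up): if users were pinned to specific nodes, compromising a handful of nodes would fully expose whichever users landed on them; with per-request random routing, the same compromise exposes only a tiny, unpredictable slice of anyone's traffic.

    import random

    NODES, COMPROMISED, REQUESTS = 10_000, 5, 200    # hypothetical fleet and attacker
    bad = set(random.sample(range(NODES), COMPROMISED))

    # Partitioned design: a user is pinned to one "home" node for everything.
    home = random.randrange(NODES)
    pinned_exposed = REQUESTS if home in bad else 0

    # Target diffusion: every request goes to a fresh, unpredictable node.
    diffused_exposed = sum(random.randrange(NODES) in bad for _ in range(REQUESTS))

    print(pinned_exposed, diffused_exposed)
    # Pinned: all-or-nothing, so compromising the right few nodes captures a
    # targeted user completely. Diffused: expected exposure is only
    # REQUESTS * COMPROMISED / NODES = 0.1 requests, and an attacker cannot
    # steer a chosen user onto the compromised nodes.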

[1] https://security.apple.com/documentation/private-cloud-compu...


> There is, for example, as far as I understand it, still plenty of attack surface for them to run different software than they say they do.

I would not say "plenty." The protocol that clients use to connect to a PCC node leverages code signing to verify the node is running an authentic, published binary. That code signing is backed by the secure element in Apple's custom hardware (and is part of the reason PCC can only run on this custom hardware, never third party clouds). So to attack this you'd really have to attack the hardware root of trust. Apple details the measures they take here.[1]
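
Roughly, the client-side check works like the sketch below. This is my own toy rendering of the flow -- the real protocol uses asymmetric attestation signatures rooted in the secure element plus a public transparency log of released binaries, not a shared HMAC key -- but it shows why a node can't quietly run unpublished software and still receive user data.

    import hmac
    from hashlib import sha256

    ROOT_OF_TRUST_KEY = b"fused-into-silicon"    # stand-in for the hardware key
    TRANSPARENCY_LOG = set()                     # hashes of publicly released binaries

    def publish_release(binary: bytes):
        TRANSPARENCY_LOG.add(sha256(binary).hexdigest())

    def node_attest(binary: bytes):
        # The secure element signs a measurement (hash) of the booted software.
        measurement = sha256(binary).hexdigest()
        signature = hmac.new(ROOT_OF_TRUST_KEY, measurement.encode(), sha256).hexdigest()
        return {"measurement": measurement, "signature": signature}

    def client_accepts(att) -> bool:
        expected = hmac.new(ROOT_OF_TRUST_KEY, att["measurement"].encode(), sha256).hexdigest()
        signed_by_hardware = hmac.compare_digest(expected, att["signature"])
        publicly_auditable = att["measurement"] in TRANSPARENCY_LOG
        return signed_by_hardware and publicly_auditable    # only then send any data

    publish_release(b"pcc-release-1.0")
    print(client_accepts(node_attest(b"pcc-release-1.0")))      # True
    print(client_accepts(node_attest(b"secret-debug-build")))   # False: not in the log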

Having said that, it would be a mistake to assume Apple is trying to cryptographically prove that Apple is not a fundamentally malicious actor that has designed a system to trick you. That's not the goal here.

What they do provide a strong guarantee of is that your data is safe from things like a rogue internal actor, a critical software vulnerability, an inadvertent debug-log data leak, or a government subpoena. That's a huge step forward, and it goes far beyond what other architectures can guarantee in an independently verifiable way.

[1] https://security.apple.com/documentation/private-cloud-compu...


> The code is being shown to selected individuals deemed suitable and then they're telling me by way of proxy.

That is incorrect! The binaries are public and inspectable by anyone. The tools for doing so are bundled right into macOS -- they're literally on every consumer's machine.[1]

Furthermore the protocol for connecting to a PCC node involves cryptographic proof that it's running a published binary.

[1] https://security.apple.com/documentation/private-cloud-compu...


Binaries != code. A security professional cannot evaluate a remote service by inspecting the binary that (supposedly) runs on a remote system. Even under ideal conditions it's a move that proves you still have something to hide by not just showing people the code your architecture is running. It's as if Apple will do anything to prove their innocence except removing all doubt.


Your concern is that the hardware auditors are not trustworthy because Apple hired them?

I mean that’s fair, but I don’t think the goal here is to offer that level of guarantee. For example, their ceremony involves people from three other Apple organizational units, plus the auditor. It’s mostly Apple doing the certification. They’re not trying to guard too heavily against the “I don’t trust that Apple isn’t trying to fool me” concern.

What this does protect you from is stuff like a rogue internal actor, a software vulnerability, or a government subpoena. The PCC nodes are “airtight” and provably do not retain or share data. This is auditable by the whole security community, and clients can verify they are communicating with an auditable binary. It’s not just a white paper.

That’s an enormous step up from the status quo.


PCC is on a whole different level. With GCP, for example, you still have to trust that Google is doing what it says to control access. PCC makes that auditable and verifiable by clients when connecting to a node.

You can also audit that the binaries don’t leak any data in, say, debug logs, a kind of leak that is definitely possible on GCP/Borg. PCC nodes are “cryptographically airtight.”


I'm only here to correct the parent's false claims.


PCC is fundamentally more secure than merely encrypting at rest and auditing access. That approach still leaves a variety of attack vectors, such as a software bug that leaks data.

Apple is unable to access the data even if subpoenaed, for example, and this is provable via binary audits and client verification that they are communicating with an auditable node.


How is that any different in either direction? Bugs exist in any and all code. Encrypted data can't be decrypted if you don't have the keys.

I don't see that Apple software is any different in that regard (just try using macOS for any length of time, even on Apple silicon, and you run out of fingers to count obvious UI bugs pretty quickly in day-to-day usage). And obviously AWS won't be able to decrypt your data without your keys either.

The people running these huge multi-billion-dollar clouds are not idiots making fundamental errors in security. This is why they all pay mega salaries for highly skilled people, offer five-figure bug bounties, etc. -- they take this seriously. Would some random VPS provider be more likely to make errors like this? Sure, but they are not (and are not expected to be) in the same league.


And just to confirm: less than 24 hours later, a post here on HN about a whole new batch of critical security bugs in macOS: https://jhftss.github.io/A-New-Era-of-macOS-Sandbox-Escapes/

I would not trust Apple any more or less than any other big-time cloud provider.


He had my attention until he posted that mind-numbingly generic Ubuntu ad. Most people would tune that out in seconds.

I’m kind of glad Apple is putting a touch more flavor in their ads.

