Apple made a very big deal about Apple Intelligence’s privacy credentials when it launched the AI suite earlier this year. There has been some skepticism about those claims, especially from people like Elon Musk, who took particular offense at Apple’s partnership with OpenAI to bring ChatGPT to its devices. But now Apple is putting its money where its mouth is, launching its first Apple Intelligence bug bounty.
Specifically, Apple is inviting hackers to investigate the Private Cloud Compute (PCC) feature. While on-device AI is inherently more private because all the data stays on the phone, cloud computing is a different matter. PCC is Apple’s attempt to fix that, offering cloud-based AI processing without compromising data security or user privacy.
But clearly Apple isn’t expecting us all to take its word for it, and is actively inviting security researchers and “anyone with interest and a technical curiosity” to independently verify the company’s claims about PCC. It would be a huge blow to Apple if the system were somehow compromised and bad actors gained access to supposedly secure user data.
The point of bug bounties is to incentivize hackers and other security professionals to probe a system and report what they find. Hackers are an intrepid bunch, and can often find ways to stress-test systems that in-house developers never thought of. By reporting any problems they come across, they form a mutually beneficial arrangement with Apple: Apple gets to fix security flaws quietly, before user data is exposed to the wrong people, and the hackers get paid for their effort.
In the case of PCC, Apple is offering various rewards depending on the issue reported, but the maximum has now been increased to $1 million. That sum is reserved for “Arbitrary code execution with arbitrary entitlements” achieved through a “remote attack on request data.” That should tell you how seriously Apple is taking this, or at least how confident it is that PCC is secure.
Overall, a million dollars is a small price to pay to avoid the PR disaster that would occur if criminals found a way in.
To facilitate this, Apple is offering various tools and resources to aid bug bounty hunters in their work. These include a security guide covering PCC’s technical details, source code for “certain key components of PCC that help to implement its security and privacy requirements,” and a Virtual Research Environment (VRE) for performing security analysis of PCC. The latter requires a Mac with Apple Silicon, at least 16GB of RAM, and the macOS Sequoia 15.1 developer preview.
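If you want a quick sanity check before downloading anything, here’s a minimal Swift sketch that tests a Mac against those stated VRE requirements. It’s based purely on the specs listed above, not on any official Apple tooling, and can be run from Terminal with `swift check_vre.swift`.

```swift
import Foundation

// Minimal sketch: does this Mac meet the stated PCC Virtual Research
// Environment requirements (Apple Silicon, 16GB+ RAM, macOS 15.1+)?
// The thresholds come from Apple's published requirements; everything
// else here is illustrative.

let info = ProcessInfo.processInfo

// Apple Silicon check: a natively compiled binary on Apple Silicon
// builds for arm64. (Running under Rosetta would report x86_64.)
#if arch(arm64)
let isAppleSilicon = true
#else
let isAppleSilicon = false
#endif

// Physical memory, converted from bytes to gigabytes.
let ramGB = Double(info.physicalMemory) / 1_073_741_824

// OS version check: macOS 15.1 or later.
let os = info.operatingSystemVersion
let osOK = (os.majorVersion, os.minorVersion) >= (15, 1)

print("Apple Silicon: \(isAppleSilicon)")
print(String(format: "RAM: %.1f GB (need at least 16)", ramGB))
print("macOS: \(os.majorVersion).\(os.minorVersion) (need 15.1 or later)")
print("Meets stated VRE requirements: \(isAppleSilicon && ramGB >= 16 && osOK)")
```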
Privacy is always a concern when you’re using online services, and cloud-based AI is no different. Thankfully, Apple still seems to be sticking to its usual privacy-centric approach, and is openly inviting outside scrutiny to ensure things are secure. It won’t please everyone, but it’s better than nothing.