Apple Intelligence bug bounty invites researchers to test its privacy claims

Apple is inviting researchers to probe its Private Cloud Compute (PCC) system, which handles the more computationally intensive Apple Intelligence queries. The company is also expanding its bug bounty program, offering payouts of up to $1,000,000 to people who discover PCC vulnerabilities.

The company has emphasized how many AI features (branded as Apple Intelligence) can run entirely on device, without data leaving your Mac, iPhone, or other Apple hardware. More demanding requests, however, are sent to PCC servers built on Apple Silicon and running a new operating system.

Many AI applications from other companies also rely on servers to fulfill more difficult requests. Yet users have little visibility into the security of these server-based operations. Apple has, of course, placed a lot of emphasis on how much it cares about user privacy over the years, so poorly designed AI cloud servers could put a dent in that image. To prevent this, Apple says it designed the PCC so that the company's security and privacy guarantees are enforceable and security researchers can independently verify these guarantees.

Through the expanded bug bounty program, Apple is offering payouts ranging from $50,000 to $1,000,000 depending on the category of vulnerability discovered. Apple will also review any security issue that “has a significant impact on PCC” for a potential reward, even if it falls outside the published categories.

The first Apple Intelligence features are expected to be available to everyone next week with iOS 18.1. Some of the larger Apple Intelligence features, including Genmoji and ChatGPT integration, appeared in the first iOS 18.2 developer beta released yesterday.

