Apple is offering a reward of up to $1 million to anyone who can hack its new fleet of AI-focused servers meant for Apple Intelligence, which launched this week.
As PCMag reports, Apple is asking researchers to test the security of “Private Cloud Compute,” the servers that will receive and process user requests for Apple Intelligence when an AI task is too complex for on-device processing on an iPhone, iPad, or Mac.
To address privacy concerns, Apple designed Private Cloud Compute servers to immediately delete a user’s request once the task is fulfilled. In addition, the system features end-to-end encryption, meaning Apple cannot view the user requests made through Apple Intelligence, even though it controls the server hardware.
Still, Apple has invited the security community to vet the privacy claims around Private Cloud Compute. Cupertino started with a select group of researchers, but on Thursday, the company opened the door to any interested members of the public.
Apple is offering access to the source code for key components of Private Cloud Compute, giving researchers an easy way to analyze the technology’s software side. The company also created a “virtual research environment” for macOS that can run the Private Cloud Compute software. Another helpful tool is a security guide that covers more technical details about the company’s server system for Apple Intelligence.
“To further encourage your research in Private Cloud Compute, we’re expanding Apple Security Bounty to include rewards for vulnerabilities that demonstrate a compromise of the fundamental security and privacy guarantees of PCC,” the company added.
Rewards include $250,000 for discovering a way to remotely hack Private Cloud Compute into exposing a user’s request data. Apple is also offering $1 million if you can remotely attack the servers to execute arbitrary code with privileges. Lower rewards will be granted for security research that uncovers how to attack Private Cloud Compute from a “privileged network position.”
Apple says it’ll also consider rewarding a reported vulnerability “even if it doesn’t match a published category.” “We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time,” the company says.
—
Photo Credit: PriceM / Shutterstock.com