Apple Challenges Anyone to Find a Flaw in Its Darling AI and Offers a $1 Million Reward
Apple is very proud of the privacy protections built around Apple Intelligence. So proud, in fact, that it is offering serious money to anyone who finds a privacy flaw or a remote code execution exploit in its code. Apple’s first bug bounty program for its artificial intelligence starts at a hefty $50,000 for any unintentional data disclosure, but the real prize is $1 million for a remote attack on Private Cloud Compute, Apple’s new cloud processing system.
Apple first announced Private Cloud Compute in June, alongside the new artificial intelligence features coming to iOS and iPadOS, and eventually macOS. The centerpiece of Apple Intelligence is a revamped Siri that can work across apps. As demonstrated, Siri can dig through your text messages for the details of your cousin’s birthday that your mom sent you, then pull additional information from your emails to create a calendar event. Some of this processing happens on Apple’s own cloud servers, which means Apple will be handling a treasure trove of data that most users want to keep private.
To maintain its reputation as a company committed to privacy, Apple describes Private Cloud Compute as an additional layer of software and hardware security. Simply put, Apple claims your data will be secure, and that it will not, and by design cannot, retain it.
This brings us to the bug bounty program. In a post on Thursday, Apple’s security team invited “all security researchers – or anyone with a technical interest and curiosity… [to] independently verify our claims.”
Until now, Apple has only let select third-party auditors poke around, but this is the first time it has opened PCC up to the public. The company is providing a security guide plus access to a Virtual Research Environment for analyzing PCC inside the macOS Sequoia 15.1 developer preview. To access it, you will need a Mac with an M-series chip and at least 16GB of RAM. Cupertino has also published portions of the PCC source code on GitHub.
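If you want a quick sanity check that your machine meets those stated requirements before setting up the research environment, a minimal Swift sketch like the one below can do it. The 16GB figure and the Apple silicon requirement come from Apple’s announcement; the checks themselves are just one illustrative way to verify them.

```swift
import Foundation

// Minimal sketch: verify Apple's stated VRE prerequisites (an Apple silicon
// Mac with at least 16GB of memory). The requirements come from Apple's
// announcement; this particular check is just one illustrative approach.

// Physical memory in GB (1 GB = 2^30 bytes).
let memoryGB = Double(ProcessInfo.processInfo.physicalMemory) / 1_073_741_824
print(String(format: "Installed memory: %.1f GB", memoryGB))
print(memoryGB >= 16 ? "Meets the 16GB requirement" : "Below the 16GB requirement")

// hw.optional.arm64 reads 1 on Apple silicon; the key is absent on Intel Macs,
// in which case sysctlbyname fails and isAppleSilicon stays 0.
var isAppleSilicon: Int32 = 0
var size = MemoryLayout<Int32>.size
sysctlbyname("hw.optional.arm64", &isAppleSilicon, &size, nil, 0)
print(isAppleSilicon == 1 ? "Running on Apple silicon" : "Not an Apple silicon Mac")
```

From there, the published source is a plain Git repository, so inspecting it requires nothing more exotic than a clone and a code editor.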
In addition to calling all hackers and script kiddies to the table, Apple is offering a range of payouts for security flaws. The base amount of $50,000 is reserved for “accidental or unexpected data disclosure,” but you can earn $250,000 for “access to users’ request data or sensitive information about the users’ requests.” The top reward of $1 million is for “arbitrary code execution with arbitrary entitlements.”
The bounty is a sign of Apple’s confidence in the system, and at minimum the open call lets more people look under the hood of Apple’s cloud operations. The initial release of iOS 18.1 is expected to reach iPhones on Oct. 28. There is already a preview release of iOS 18.2 that gives users access to the ChatGPT integration. Apple requires users to grant permission before ChatGPT can see any of their requests or interact with Siri. OpenAI’s chatbot is effectively a stopgap until Apple’s full suite of AI features is in place.
Apple promotes its strong record on privacy, even though it has a tendency to track users within its own software ecosystem. In the case of PCC, Apple claims it will have no ability to see your Siri requests or logs. Anyone who digs into the source code may be able to verify the tech giant’s privacy claims before Siri finally gets its full upgrade, likely sometime in 2025.