A couple of days ago, Apple announced that it would roll out a new feature that scans users' iCloud photos for child sexual abuse images.
- Apple has been in the news for quite some time because of its child safety policies.
- Apple has now issued a clarification saying that it will only use its system to scan images that have already been flagged.
- Apple said a threshold of 30 images must be discovered in a person's phone before Apple's system alerts the company for human review.
Apple has been in the news for quite some time because of its child safety policies. A couple of days ago, Apple announced that it would roll out a new feature that scans users' iCloud photos for child sexual abuse images. The announcement soon drew the ire of privacy advocates, who heavily criticised the move. However, Apple has now issued a clarification saying that it will only use its system to scan images that have already "been flagged by clearinghouses in multiple countries."
As per a Reuters report, Apple said a threshold of 30 images must be discovered in a person's phone before its system alerts the company, at which point a human reviews the images and decides whether they should be reported to the authorities. Apple said it would start with 30, but the number could be lowered over time.
“Before the threshold is exceeded, the cryptographic construction does not allow Apple servers to decrypt any match data, and does not permit Apple to count the number of matches for any given account. After the threshold is exceeded, Apple servers can only decrypt vouchers corresponding to positive matches, and the servers learn no information about any other images. The decrypted vouchers allow Apple servers to access a visual derivative such as a low-resolution version of each matching image,” Apple explained in a detailed technical paper.
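The threshold behaviour Apple describes can be pictured with a minimal sketch. This is not Apple's actual cryptographic construction (which relies on threshold secret sharing, so the server genuinely *cannot* decrypt below the threshold rather than merely declining to); all names here, such as `Voucher` and `review_queue`, are hypothetical illustrations of the gating logic only.

```python
# Illustrative sketch only -- NOT Apple's real protocol. In the real system
# the server cannot read match status or count matches below the threshold;
# here we simply model the access rule the paper describes.
from dataclasses import dataclass

THRESHOLD = 30  # initial figure cited in the Reuters report


@dataclass
class Voucher:
    """Stand-in for a cryptographic safety voucher."""
    is_match: bool   # cryptographically hidden in the real construction
    payload: bytes   # encrypted visual derivative (e.g. a low-res image)


def review_queue(vouchers: list[Voucher]) -> list[bytes]:
    """Return decryptable payloads only once the match threshold is reached.

    Below the threshold, nothing is revealed; at or above it, only the
    payloads of positive matches become accessible, and non-matching
    vouchers stay opaque.
    """
    matches = [v for v in vouchers if v.is_match]
    if len(matches) < THRESHOLD:
        return []                        # nothing decryptable yet
    return [v.payload for v in matches]  # non-matches remain hidden
```

For example, an account with 29 matching vouchers yields an empty review queue, while a 30th match exposes the visual derivatives of exactly the matching images and nothing else.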
The Reuters report also reveals that Apple was not happy with how it handled the communications around the upcoming technology. However, Apple did not divulge whether it had made any policy changes following the criticism. The Cupertino giant assured that the technology is still in its developmental phase, so changes are expected before the final rollout.
Earlier, it was reported that Apple's own employees were unhappy with the company's child safety features. Employees had sent close to 800 messages to an internal Apple Slack channel discussing the move. They worried that the feature could be exploited by repressive governments, such as China's, to track content unrelated to child sexual abuse, with Apple spying on people at a government's insistence.
Apple had earlier revealed on its blog that the feature will be rolled out in the United States first and expanded to other countries later.