
Apple delays child abuse photo scanning planned for iOS 15

Last month, Apple announced a new child safety policy that would automatically scan user photos for child sexual abuse material (CSAM), prompting an outcry from privacy advocates and consumers over potential privacy violations and government overreach. Now Apple is delaying the technology's rollout so it can gather feedback "in the coming months" before releasing it fully.

Apple had planned to include its CSAM scanning technology, along with an accompanying optional feature to detect sexually explicit content in iMessage for young users, in iOS 15 and iPadOS 15, which are expected to launch alongside the iPhone 13 (rumored to be revealed on September 14). It would have gone live in the US first, with no announced plans for a global rollout. Here is Apple's full statement on the delay, via TechCrunch:

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Shortly after introducing the new policies in early August via a blog post, Apple followed up with a multi-page FAQ giving detailed explanations of how both the CSAM scanning and the iMessage detection feature would work.

Apple planned to use its so-called NeuralHash technology to automatically scan photos and check whether they matched hashes of known CSAM. The technology would only scan images as they were uploaded to iCloud (which is encrypted).
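To make the matching step concrete, here is a minimal Swift sketch of the general idea of comparing an image's perceptual hash against a set of known hashes at upload time. The `perceptualHash(of:)` function and the hash values are hypothetical placeholders, not Apple's actual NeuralHash API.

```swift
import Foundation

// A minimal sketch of the hash-matching idea described above, not Apple's
// actual NeuralHash implementation. `perceptualHash(of:)` and the known-hash
// values are hypothetical placeholders.
typealias PerceptualHash = String

// Hypothetical stand-in for a perceptual hashing function such as NeuralHash.
func perceptualHash(of imageData: Data) -> PerceptualHash {
    // A real perceptual hash derives a compact fingerprint from image features;
    // this placeholder only illustrates the interface.
    return imageData.base64EncodedString()
}

// Hashes of known CSAM supplied by child-safety organizations (placeholder values).
let knownCSAMHashes: Set<PerceptualHash> = ["hash-a", "hash-b"]

// Check an image only at upload time, mirroring the "scan as photos are
// uploaded to iCloud" behavior the article describes.
func shouldFlagOnUpload(_ imageData: Data) -> Bool {
    knownCSAMHashes.contains(perceptualHash(of: imageData))
}
```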

But the potential for governments to exploit the automatic photo-scanning policy for their own purposes alarmed privacy advocates and industry groups. The Electronic Frontier Foundation (EFF) criticized the company for building any kind of "backdoor" into user data, while the Center for Democracy and Technology (CDT) assembled a coalition that laid out how the photo scanning could be abused by governments seeking objectionable material.

The CDT also argued that another Apple policy planned to roll out alongside the CSAM photo scanning, an optional iMessage feature that blurs sexually explicit images sent to users under 13 and notifies parents linked to the same family account, could "threaten the safety and well-being of some young people, and young LGBTQ+ people with unsympathetic parents are particularly at risk."
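For illustration, the decision flow the article describes for that optional feature can be summarized in a short Swift sketch. The type names, the under-13 age threshold applied to parental notification, and the family opt-in flag below are assumptions drawn from the article's description, not Apple's actual implementation.

```swift
// A minimal sketch of the decision flow described above; the types and flags
// are illustrative assumptions, not Apple's real iMessage API.
struct ChildAccount {
    let age: Int
    let parentalNotificationsEnabled: Bool  // assumed family opt-in setting
}

enum IncomingImageAction {
    case showNormally
    case blur(notifyParents: Bool)
}

// Decide how to present an incoming image to a child account.
func action(for account: ChildAccount, imageIsSexuallyExplicit: Bool) -> IncomingImageAction {
    guard imageIsSexuallyExplicit else { return .showNormally }
    // Parents are notified only for users under 13 whose family has opted in.
    let notify = account.age < 13 && account.parentalNotificationsEnabled
    return .blur(notifyParents: notify)
}
```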

Finally, Apple also planned updates to Siri and Search to provide additional resources related to CSAM, and to intervene with warnings and support resources when users search for CSAM-related material. It is not clear whether this will also be delayed.
