Apple is introducing a new feature to iMessage in Australia that will let children report nude images and videos sent to them directly to the company, which may in turn report the messages to the police.
The change arrived on Thursday as part of a beta release of a new version of Apple's operating systems for Australian users. It extends the communication safety feature that has been enabled by default for Apple users under 13 since iOS 17, and that is available to all users. Under the existing safety features, the iPhone automatically detects images and videos containing nudity that a child might receive or attempt to send in iMessage, AirDrop, FaceTime, or Photos. Detection happens on the device to protect privacy.
If a sensitive image is detected, young users are shown two intervention screens before they can continue, offering resources or a way to contact a parent or guardian.
The new feature adds the option to report images and videos to Apple when such a warning appears. The device prepares a report containing the image or video, the messages sent immediately before and after it, and the contact information for both accounts, and users can fill out a form describing what happened.
Apple will review the report and can take action against an account, such as disabling its ability to send messages via iMessage, and can also report the issue to law enforcement.
Apple said it plans to roll out the feature first in Australia with the latest beta update, and to release it globally in the future.
The timing of the announcement, and the choice of Australia as the first region to receive the feature, coincides with new rules coming into effect there: by the end of 2024, technology companies will be required to police child abuse and terrorist content on cloud and messaging services operating in Australia.
Apple had warned that the draft codes would not protect end-to-end encryption, leaving the communications of everyone who uses its services vulnerable to mass surveillance. Australia's eSafety Commissioner ultimately softened the rules, allowing companies that believe compliance would break end-to-end encryption to demonstrate alternative measures for addressing child abuse and terrorist content.
Apple has faced intense criticism from regulators and law enforcement agencies around the world for its refusal to compromise iMessage's end-to-end encryption for law enforcement purposes. In late 2022, Apple abandoned plans to scan photos and videos stored in iCloud for child sexual abuse material (CSAM), drawing further criticism. Apple, WhatsApp, and other pro-encryption groups argue that encryption backdoors would put user privacy at risk around the world.
Source: www.theguardian.com