Apple is expanding its Communication Safety features in iOS 18.2 with a new option that will not only blur nude photos and videos in messaging conversations but also give kids the ability to report those messages to Apple.
The Communication Safety in Messages feature debuted in iOS 15.2 nearly three years ago, following controversies over Apple's August 2021 announcement of plans to help prevent child abuse.
It was one of three separate initiatives Apple had planned to roll out as part of what it called "Expanded Protections for Children." The most innocuous of these was a set of Siri, Spotlight, and Safari search updates designed to guide users toward appropriate resources when searching for topics related to child exploitation.
However, the one that upset privacy advocates the most was Apple's plan to use a specialized algorithm to scan photos against a database of known child sexual abuse material (CSAM) before they were uploaded to the user's iCloud account. That plan quietly died in the face of serious opposition from privacy experts, who warned it was a slippery slope that oppressive regimes could abuse simply by swapping the CSAM database for images of known dissidents or other politically disfavored material.
Sadly, Communication Safety in Messages got caught in the crossfire. Unlike CSAM detection, which would have compared photos in a person's iCloud Photo Library against a database of known images and potentially reported significant collections of harmful pictures to law enforcement, Communication Safety in Messages was an extension of parental controls intended to protect children from predators who might send them photos containing nudity (or try to coax impressionable kids into sending such photos themselves).
As Craig Federighi, Apple's senior vice president of software engineering, candidly admitted, the company muddied the waters by announcing all three initiatives together. That led many people to sound alarm bells based on the false notion that Apple was scanning all of their private communications.
However, the Communication Safety feature doesn't check photos against databases or report anything to Apple or law enforcement. Instead, on-device machine learning analyzes pictures sent and received in the Messages app to identify those that might contain nudity, blurring them and requiring the user to take additional steps to view them. No information about these photos ever leaves the child's device. The original proposal would have notified parents of kids under 13 if the child chose to view a photo anyway after being warned; however, Apple walked that back after child safety advocates pointed out that it could lead to abuse by less understanding parents.
Since it debuted in iOS 15.2, the Communication Safety feature has remained unchanged. The technology behind it worked well enough, though, that Apple expanded it into Sensitive Content Warnings in iOS 17, allowing users of any age to have nude photos and videos blurred automatically.
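Apple also opened this on-device detection to third-party developers in iOS 17 through the SensitiveContentAnalysis framework. As a rough sketch of how that kind of check looks from an app's perspective (the function name and image URL are illustrative, and this isn't necessarily the exact path Messages itself takes), an app with the appropriate entitlement can ask the analyzer whether an image should be blurred:

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch: deciding whether to blur a received image, entirely on-device.
// Requires the com.apple.developer.sensitivecontentanalysis.client entitlement, and
// only produces results if the user (or a parent) has enabled Sensitive Content
// Warnings or Communication Safety in Settings.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user hasn't opted in, the framework reports a disabled policy
    // and there is nothing to analyze.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The analysis runs locally; no image data leaves the device.
        let result = try await analyzer.analyzeImage(at: url)
        return result.isSensitive
    } catch {
        // Hypothetical fallback: treat analysis failures as "don't blur".
        return false
    }
}
```

The key point, and the one Apple leans on, is that the analysis is entirely local and opt-in: if the user hasn't enabled Sensitive Content Warnings or Communication Safety, the analyzer's policy is disabled and nothing gets checked.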
Now, Apple is taking Communication Safety a step further in iOS 18.2, giving kids the ability to report unsolicited nudes they receive.
According to The Guardian, Apple is rolling the feature out first in Australia, where new safety laws coming into force by the end of this year will require tech companies that run cloud services to monitor them for CSAM and pro-terror material.
There's no word on when it will expand beyond Australia; Apple only says it "will be released globally in the future."
In iOS 18.2, the feature will be entirely optional. It also won't compromise the security of iMessage's end-to-end encryption, as it will rely on end users to report material. Apple will have no access to the content of iMessage conversations unless someone chooses to make a report. This is similar to how WhatsApp, which also uses end-to-end encryption, handles abuse reporting.
The reporting option will be added to the two warning screens that already appear when a sensitive image is detected on a device where a parent has enabled Communication Safety. Those screens already include links to helpful resources and an option to message a parent, guardian, or another trusted adult. With iOS 18.2, users will also have the option to report the image or video to Apple.
According to The Guardian, sending a report will forward the image or video to Apple, along with the messages sent immediately before and after it was received. Contact information for both accounts will also be included, and the user will be able to provide additional details.
The report will only be sent to Apple, where it will be reviewed by trained staff, who will decide what actions to take. These could include anything from suspending the account of the person sending the unsolicited nudes to reporting the issue to law enforcement.