New child safety features coming to iOS 15, iPadOS 15, and macOS Monterey

Apple's new features aim to protect children and limit the spread of Child Sexual Abuse Material (CSAM) across its platforms.

What you need to know

  • Apple is committing to adding extra protections for children across its platforms.
  • New tools will be added in Messages to help protect children from predators.
  • New tools in iOS and iPadOS will help detect CSAM in iCloud Photos.

It's an ugly part of the world we live in, but children are often the target of abuse and exploitation online and through technology. Today, Apple announced several new protections coming to its platforms — iOS 15, iPadOS 15, and macOS Monterey — to help protect children and limit the spread of CSAM.

The Messages app will be getting new tools to warn kids and their parents when they receive or send sexually explicit photos. If an explicit photo is received, the image will be blurred and the child will be warned. On top of being warned, they will also be presented with helpful resources and reassured that it is okay not to view the photo. Apple also says parents can opt to be notified, so they know their child may not be okay.

" As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it."

Apple also reassures users that these new tools in Messages use on-device machine learning to analyze the images, in a way that does not give the company access to the messages.
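To get a feel for how that flow fits together, here is a minimal, purely hypothetical Swift sketch. The classifier, the types, and the parental-notification hook are all stand-ins invented for illustration; Apple has not published an API for this feature. The point is simply that the explicit-content check runs on the device itself.

```swift
import Foundation

// Hypothetical types for illustration only; not an Apple API.
struct IncomingPhoto {
    let imageData: Data
    let senderID: String
}

enum PhotoPresentation {
    case showNormally(Data)
    case blurredWithWarning(Data, resources: [String])
}

protocol OnDeviceExplicitContentClassifier {
    // Runs entirely on the device and never uploads the image.
    func isSexuallyExplicit(_ imageData: Data) -> Bool
}

struct ChildMessagingSafety {
    let classifier: OnDeviceExplicitContentClassifier
    let notifyParent: (String) -> Void   // stand-in for the opt-in parental notification

    // Called when a photo arrives: explicit photos are blurred and paired
    // with resources and a reminder that it's okay not to look.
    func screen(_ photo: IncomingPhoto) -> PhotoPresentation {
        guard classifier.isSexuallyExplicit(photo.imageData) else {
            return .showNormally(photo.imageData)
        }
        return .blurredWithWarning(photo.imageData,
                                   resources: ["It's okay not to view this photo.",
                                               "You can talk to someone you trust."])
    }

    // Called only if the child chooses to view a photo that was flagged.
    func childChoseToView(_ photo: IncomingPhoto) {
        notifyParent("Your child viewed a photo from \(photo.senderID) that was flagged as sensitive.")
    }
}
```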

CSAM detection

Another big concern Apple is addressing is the spread of CSAM. New technology in iOS 15 and iPadOS 15 will allow Apple to detect known CSAM images stored in iCloud Photos and report those images to the National Center for Missing & Exploited Children (NCMEC).

" Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."

The exact process is complicated, but Apple assures users that the method has an extremely low chance of flagging content incorrectly.
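In very rough terms, the on-device matching step boils down to "hash the photo locally and check it against a list of known hashes." The Swift sketch below is a toy stand-in for that idea only: a plain SHA-256 digest and a readable set take the place of the perceptual image hash and the unreadable, transformed database Apple describes.

```swift
import Foundation
import CryptoKit

// Toy illustration of the matching flow only, NOT Apple's construction.

// Stand-in for the transformed hash database stored on the device.
// In the real design, this set is not readable by the device owner or Apple.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

// Hash a photo's bytes locally; nothing leaves the device in this step.
func hashOnDevice(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// The match result is recorded in an encrypted "safety voucher" uploaded
// with the photo; here it is reduced to a simple Bool for illustration.
func matchesKnownImage(_ imageData: Data, against database: Set<String>) -> Bool {
    database.contains(hashOnDevice(imageData))
}

let database = loadKnownHashDatabase()
let photo = Data([0x01, 0x02, 0x03])               // placeholder bytes
print(matchesKnownImage(photo, against: database))  // false for the empty demo database
```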

"Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."

Siri and search updates

Lastly, Apple also announced that Siri and Search will provide additional resources to help children and parents stay safe online and get help if they find themselves in unsafe situations. On top of that, Apple says there will be protections in place when someone searches for queries related to CSAM.

"Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."

These updates are expected in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey later this year.

from iMore
