Apple to scan user photos to detect Child Sexual Abuse Material

Featured image: Tim Cook, CEO of Apple Inc.; Credits: Reuters

Global tech giant Apple Inc. announced on Thursday that its Child Sexual Abuse Material (CSAM) detection tool will begin scanning photos users upload to their iCloud Photo Library for child abuse material, and will lock users out should multiple images be detected.
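The report describes a system that compares uploaded photos against a database of known material and acts only after multiple matches. As a rough illustration of that threshold-matching idea (not Apple's actual implementation, which uses a perceptual hash called NeuralHash and privacy-preserving cryptographic matching; the names and threshold below are hypothetical):

```python
# Illustrative sketch only. Apple's real pipeline matches NeuralHash
# perceptual hashes against NCMEC's database using cryptographic
# protocols; this stand-in uses plain set membership and a made-up
# threshold to show the "act only after multiple matches" logic.

KNOWN_CSAM_HASHES = {"a1b2", "c3d4"}  # hypothetical stand-in for the hash database
MATCH_THRESHOLD = 3  # an account is flagged only after several matches

def count_matches(photo_hashes):
    """Count uploaded photo hashes that appear in the known-hash set."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def should_flag(photo_hashes):
    """Flag the account only once the match count reaches the threshold."""
    return count_matches(photo_hashes) >= MATCH_THRESHOLD
```

The threshold is the key design choice reported here: a single match does nothing, which reduces the impact of false positives from the hash comparison.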

Apple is also adding an image-scanning tool for minors in Messages, so both children and parents are alerted when sexually explicit imagery is involved. The feature warns the minor against viewing the content and notifies the parent should the child decide to look. Apple says the scanning tool was trained on pornography and can detect sexually explicit content.

To use the child-safety features, parents must opt in to the communication safety feature in Messages; they will then be notified if an explicit image is opened by a child aged 12 or younger.

Children aged 13 through 17 will still receive a warning when opening an image, but their parents will not be notified if the image is opened.
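The age-gated behaviour described in the two paragraphs above reduces to a simple rule: all minors get a warning, but only the parents of children 12 or younger are notified. A minimal sketch of that policy (the function and variable names are hypothetical):

```python
def messages_safety_response(child_age, opened_image):
    """Return (warn_child, notify_parent) per the reported policy:
    every minor is warned before viewing; a parent notification is
    sent only if the child is 12 or younger and opens the image.
    Names here are illustrative, not Apple's API."""
    warn_child = child_age <= 17               # all minors get a warning
    notify_parent = child_age <= 12 and opened_image
    return warn_child, notify_parent
```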

Apple’s new set of child safety features, developed in collaboration with the National Center for Missing and Exploited Children (NCMEC), seeks to protect children from abusive activity and sexual imagery.




About the author

Wajiha Wahab

A Delhiite by birth and part of the editorial team at LAFFAZ, Wajiha has a keen interest in covering startups, gathering information, and presenting it to readers in an engaging way.

