Featured image: Tim Cook, CEO of Apple Inc. | Credits: Reuters
Global tech giant Apple Inc. on Thursday announced that its Child Sexual Abuse Material (CSAM) detection tool will begin scanning photos users upload to their iCloud Photo Library for known child abuse imagery, and will disable a user's account should multiple matching images be detected.
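In broad strokes, the detection works by comparing a fingerprint, or hash, of each uploaded photo against a database of hashes of known abuse imagery, and flagging an account only once the number of matches crosses a threshold. The sketch below illustrates that threshold logic only; the hash function, hash database, and threshold value are placeholders, not Apple's actual NeuralHash system.

    import Foundation

    // Simplified illustration of threshold-based hash matching. The hash
    // function and threshold here are placeholders; Apple's system uses its
    // own perceptual hash (NeuralHash) and cryptographic machinery that this
    // sketch does not reproduce.
    struct CSAMDetectionSketch {
        let knownHashes: Set<String>   // hashes of known abuse imagery (placeholder)
        let matchThreshold: Int        // matches required before flagging (placeholder)

        // Stand-in for a perceptual hash; a real one tolerates resizing, cropping, etc.
        func imageHash(_ imageData: Data) -> String {
            String(imageData.hashValue)
        }

        // True only when the number of matching photos reaches the threshold.
        func shouldFlagAccount(uploadedPhotos: [Data]) -> Bool {
            let matchCount = uploadedPhotos
                .map { imageHash($0) }
                .filter { knownHashes.contains($0) }
                .count
            return matchCount >= matchThreshold
        }
    }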
Apple is also adding an image-scanning feature to Messages for minors, so that both children and their parents can be made aware when sexually explicit imagery is being viewed or shared. The feature warns the minor not to view the content and, where parental notification is enabled, alerts the parent should the child decide to look anyway. Apple says the image-scanning tool was trained on pornography and can detect sexually explicit content.
To use the child-safety features, parents have to opt in to the communication safety feature in Messages; if the child is 12 years or younger, the parent will also be notified when an explicit image is opened.
Children aged 13 through 17 will still receive a warning when opening an explicit image, but their parents will not be notified should the image be opened.
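Taken together, the rules reduce to a simple age check: any child account with the feature enabled sees a warning, but only accounts for children 12 or younger also trigger a parent notification. A hypothetical sketch of that decision logic follows; the type and function names are illustrative and not part of any Apple API.

    // Hypothetical sketch of the age-based notification rules described above.
    enum ExplicitImageAction {
        case warnChildOnly             // ages 13-17: warning shown, no parent alert
        case warnChildAndNotifyParent  // ages 12 and under: warning plus parent alert
    }

    // Returns nil when the family has not opted in to communication safety.
    func actionForExplicitImage(childAge: Int, parentOptedIn: Bool) -> ExplicitImageAction? {
        guard parentOptedIn else { return nil }
        return childAge <= 12 ? .warnChildAndNotifyParent : .warnChildOnly
    }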
Apple's new set of child-safety features, developed in collaboration with the National Center for Missing and Exploited Children (NCMEC), seeks to protect children from abusive activity and sexually explicit imagery.