Report: Apple Significantly Undercounts Child Sex Abuse Materials on iCloud and iMessage

After years of controversy over its plans to scan iCloud for child sexual abuse materials (CSAM), Apple abandoned those plans last year. Now, child safety experts accuse the tech giant of failing to flag CSAM on its services, including iCloud, iMessage, and FaceTime. They also allege that Apple is not reporting all the CSAM it does flag. The UK's National Society for the Prevention of Cruelty to Children (NSPCC) shared data with The Guardian showing that Apple is "vastly undercounting" CSAM found on its services globally. According to the…