Apple’s recently announced child safety measures have been met with a cacophony of responses, both critical and supportive, from a variety of sources. International Justice Mission (IJM), through its Center to End Online Sexual Exploitation of Children, applauds Apple for its new child safety initiatives related to iCloud Photos and Messages specifically.
While Apple’s forthcoming initiatives are imperfect—and significant room exists across the tech sector to improve detection, disruption, and reporting of child sexual abuse—Apple’s move is a positive step forward.
For that reason, it should not be delayed.
It is an issue of privacy, but whose?
Strong opposition to Apple’s announcement is primarily presented under the banner of “privacy.” Common objections describe a slippery slope toward government abuses and mass surveillance. To be clear, the child safety solutions proposed by Apple have not been corrupted for such dire ends.
Critics fear a hypothetical future risk while apparently dismissing a very real, current, and widespread harm: Untold numbers of vulnerable children have been and are being abused, exploited, and otherwise victimized by the continued production, possession, and distribution of such images.
The current conversation risks elevating the hypothetical corruption of child safety solutions over the known and rampant misuse of existing technology to harm children.
As a survivor-centered organization, IJM deeply respects what survivors of child sexual abuse tell us and others in this space: children are entitled to have every image memorializing the most painful and dehumanizing moments of their lives detected, reported, and removed from illicit circulation.
In contrast, offenders have no legal or privacy right to illegally create, possess, or share child sexual exploitation material. In fact, these acts undeniably violate the privacy of victimized children.
Survivors' voices are missing from the conversation
Unlike the hypothetical harm critics fear, the global crisis of child sexual abuse and exploitation—which Apple and others seek to counter through various safety initiatives and tech innovations—is all too real. In the face of privacy arguments against these safety measures, the Phoenix 11, a group of child sexual exploitation survivors, have rightly identified that this advocacy in the name of privacy is incomplete:
“What about our right to privacy? … It is our privacy that is violated each time an image of our child sexual abuse is accessed, possessed or shared.”
While others have provided more technical reviews of Apple’s plans, the voices of survivors have not been sufficiently prominent. We commend the Phoenix 11’s courageous advocacy for themselves and others.
IJM has also seen firsthand the harm and trauma children experience when sexual abuse and exploitation go undetected and unreported. We have likewise seen the protection and hope that become possible when that abuse is uncovered and its victims are identified.
In the Philippines, IJM has supported law enforcement since 2011 in safeguarding over 850 people from livestreamed sexual abuse and exploitation by in-person traffickers whom online sex offenders pay for new abusive content.
Among those protected are children like Joy, Ruby, Cassie, Chang, and Marj.
Ruby*, now a survivor leader as an adult, shares the trauma she endured as a child:
“I felt disgusted by every action I was forced to do just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape, to the point that I would shout whenever I heard a police siren go by, hoping somebody would hear me.”
Marj* was first exploited at the age of 13 by her friend’s older sister:
“I was confused because I was just a child. I was shaking. Then, I felt different. I felt ashamed. But I also had nowhere else to go.”
The act of forcing her to take explicit pictures was painful enough, but as Marj shared with IJM:
“…that abuse, I did not expect that it would spread. That it would be sent to other people.”
Take it from the Phoenix 11, Ruby, Marj, and others: Survivors are harmed first by the abuse they suffered, and then repeatedly through the violation of their privacy by offenders who possess and share images depicting their sexual exploitation. Apple’s new safety features are a step toward protecting the privacy of survivors while reasonably respecting the privacy of its users.
It’s not just about Apple
This is not about a single company. Improving the tech industry’s detection, disruption, and reporting of child sexual abuse is critical to protecting victims and survivors from ongoing harm. Innovations like on-device solutions hold significant promise precisely because of the potential to balance user privacy with child protection.
Fortunately, child safety leaders across the tech sector have expressed their commitment. “We are resolved to drive forward the improvements in technology and systems that will ultimately eradicate the online sexual abuse and exploitation of children on our platforms,” said the Technology Coalition’s Executive Director in its first-ever annual report.
Recent child safety announcements by Technology Coalition members Apple, TikTok, and Google are steps in the right direction, with much more to be done. Yet the current backlash against these efforts could discourage the development and adoption of additional real-world protections for children.
That would be a missed opportunity for both child protection and true privacy.