CHILD SEXUAL ABUSE HAS BECOME A CRISIS ONLINE.
TELL CONGRESS TO PASS THE STOP CSAM ACT NOW.
- combat the proliferation of child sexual abuse material online
- mandate that tech companies help victims by promptly removing their images
- increase accountability, reporting, and transparency of online platforms
- impose legal consequences on platforms that fail to remove and report it
BEING AWARE OF HOW MUCH CHILD SEXUAL ABUSE MATERIAL IS ONLINE AND TAKING ACTION IS HOW WE #TAKEITDOWN
Child Sexual Abuse Material (CSAM) is any content that depicts sexually explicit activities involving a child. This devastating crime has severe and lifelong consequences for survivors, and the sharing of CSAM contributes to their retraumatization long after the abuse has ended.
By the late ’80s, child sexual abuse material had been all but eliminated in the U.S. New laws and increased prosecution simply made it too risky to possess or distribute through the mail. But then the internet came along, and child sexual abuse material could be produced, consumed and distributed anonymously. Facilitated by high-speed broadband, end-to-end encryption, live-streaming, gaming platforms and social media, the amount of child sexual abuse material circulating online has exploded, and not enough is being done to keep children safe on technology companies' platforms.
While U.S. tech companies are legally required to report child sexual abuse imagery once they have been made aware of it, they are not required to proactively search for it. There is also no penalty for platforms that fail to remove it quickly, and there are no standards for transparency or accountability.
ChildFund supports the Stop CSAM Act because it addresses these gaps and gives victims a way to hold tech companies accountable for failing to remove this content.