Thorn
@thorn
We transform the way children are protected from sexual abuse and exploitation in the digital age.
A8. Right now, we are really hoping to see the Take It Down Act quickly signed into law. The House of Representatives passed it yesterday evening with significant support and we are hopeful that it makes its way to President Trump’s desk soon. #NCAPM25
Q8. What would you like to see happen in 2025 to strengthen online protections for children through legislation or policy? #NCAPM25
The Take It Down Act has officially been signed into law. The new legislation is important because it criminalizes the distribution of intimate images—whether real or AI-generated—shared without consent, and holds platforms accountable to act quickly. thorn.org/blog/thorn-sup…

During #NationalChildAbusePreventionMonth, @FBI launched Operation Restore Justice—a 5-day, nationwide effort that led to the arrest of 205 alleged child sex offenders and the rescue of 115 children. Hear directly from Darren Cox about the ongoing work to safeguard children.
It’s important to start conversations about digital safety now—before kids find themselves in risky situations. Our parent discussion guide, "Starting to Make 'Friends' Online," is designed to help: parents.thorn.org/guides/startin…

In 2024, @AmazonPhotos used @GetSaferio to detect and report more than 30,000 images—up nearly 25% from the year before. That impact is only possible through deep investment, innovation, and collaboration. Read the full report: aboutamazon.com/news/policy-ne…

How do we #protectkids in a world where danger is just a swipe away? @Bloomberg's latest piece, "How to Keep Your Kids Safe Online," asks a critical question: What can families do right now? bloomberg.com/news/articles/…
A5. The images created on nudifying apps can be used for sextortion, grooming, & other harms. These images circulate & the emotional toll is real. We need platform accountability, early education, & safety by design to prevent this threat from escalating even further. #NCAPM25
Q5. “Nudify” apps use AI to remove clothing from photos, often without consent. What risks do these tools pose to minors, and how can we effectively address them? #NCAPM25
A3. GAI is being used to create CSAM—either generating synthetic sexual images of children from scratch or altering real photos into abuse content. It's also being used to groom and sextort children at scale. We are at a critical moment for safety by design. #NCAPM25
Q3. How is Generative Artificial Intelligence (GAI) being used to create exploitative content involving children? #NCAPM25
Join us TOMORROW from 2–3 PM EST for an X Chat marking National Child Abuse Prevention Month. Let’s amplify each other’s voices on recognizing the signs, supporting survivors, and taking action to stop online child exploitation. #NCAPM25
42% of minors who engaged in commodified sexual interactions—where they’re offered money or something of value in exchange for a sexual interaction online—say the person asking was another minor. Read the full findings and what they mean: thorn.org/blog/commodifi…

In 2024, more platforms than ever deployed @GetSaferio—our comprehensive child sexual abuse material and child sexual exploitation detection solution. Read Safer’s 2024 Impact Report to learn more about how our tech helps content moderators: thorn.org/blog/today-the…

🚨 1 in 4 young people report being offered something of value in exchange for a sexual interaction online before turning 18. Everyone has a role to play in protecting kids growing up in our digital-first world. Read our latest report: thorn.org/blog/commodifi…
1 in 8 teens (aged 13-17) know someone who has been targeted by #AIgenerated deepfake nudes. This recent @USATODAY article tells the story of a teenage girl victimized by deepfake nudes created by a classmate. usatoday.com/story/life/hea…