Thorn

Non-profit Organizations

Manhattan Beach, CA · 32,565 followers

About us

We are Thorn. Our mission of defending children from sexual exploitation and abuse is deeply embedded within our core—a shared code that drives us to do challenging work with resilience and determination. Here, you’ll work among the best hearts and minds in tech, data, and business, creating powerful products that protect children’s futures. Unleash your own formidable talents while learning among peers and growing every day. All in a supportive environment of wellness, care, and compassion. Build your career as we help build a world where every child can be safe, curious, and happy.

Industry
Non-profit Organizations
Company size
51-200 employees
Headquarters
Manhattan Beach, CA
Type
Nonprofit
Founded
2012
Specialties
Technology innovation to combat child sexual exploitation

Updates

  • Thorn reposted this

    View profile for Dr. Rebecca Portnoff

    MIT Tech Review 35 | Fast Company AI 20 | Head of Data Science @ Thorn | AI & Child Safety | Princeton, UC Berkeley

    It was great to be back in Dublin this week for Google’s Growing Up in the Digital Age Summit! A range of critical topics were discussed: digital age verification, future risks for online child sexual abuse, AI quality, AI safety, and more. Getting to spend time in person with partners, and to meet new dedicated professionals driving forward crucial work, is always a privilege. I was glad to get the chance to share with the audience Thorn’s ongoing work to prevent the misuse of generative AI for furthering child sexual abuse. As part of the conversation, I had the chance to highlight some key areas where there are still critical gaps, which I’ll share here as well (with more related resources in the comments):

    1. We need to research and invest in scalable, reliable model assessments that are not overly reliant on prompt/response strategies. Current strategies for model assessments are inherently manual: they boil down to evaluating using prompts and assessing outputs. Given the pace and scale of newly released models into the ecosystem, and the specific sensitivities of assessing AIG-CSAM-related harms, these aren’t sufficient. We need to explore strategies that involve ascertaining a model’s learned concepts (e.g. from its latent space) and reliably mapping these concepts to a model’s capabilities.

    2. We need to research and invest in model training strategies that prevent and mitigate adversarial fine-tuning downstream. Right now, in the open-source/open-weight setting, a good-faith developer can follow all the best practices for developing a model to reduce or remove its AIG-CSAM capabilities, and then have that work undone by downstream adversarial fine-tuning and optimization.

    3. We need to move much more quickly to prevent the upload of nudifying apps and services to platforms, and to remove nudifying apps and services from search results.

    In research that Thorn (Amanda Goharian) released earlier this week, we learned that young people who admit to creating sexual deepfakes continue to have a simple and straightforward path to accessing and using nudifying technologies through app stores, search engines, and social media platforms. In that same research, 6% of American teens disclosed having been a direct victim of this form of abuse. As always, there’s more work to be done, but I’m thankful for the many people who are putting in the effort. Frances Frost John Buckley Jess Lishak Michelle Jeuken and many more!

  • View organization page for Thorn

    The risks to children online are growing in number and evolving in method. From sextortion to AI-generated child sexual abuse material (AIG-CSAM), these threats are becoming more prevalent across social media and gaming platforms. CameraForensics’ Child Safety Online Report dives deep into these challenges, highlighting what needs to be done to protect kids in the digital age. Don’t miss this important report.

    View organization page for CameraForensics

    1,004 followers

    Our new report, Child Safety Online, is now live: https://lnkd.in/ehVrbiSi From online gaming platforms to nudifying apps, offenders now have greater access to potential victims, a bigger platform for sharing abuse material, and new tools for creating it. What does this mean for investigators combating child sexual abuse material (CSAM)? And what should technology providers and global policymakers do to help? Download the report to find out: https://lnkd.in/ehVrbiSi A huge thank you goes out to Thorn, Canadian Centre for Child Protection Inc., and Elly Hanson for their guidance.

  • View organization page for Thorn

    What if you could change the world with B2B marketing—all in one day? Now’s your chance! Join us for the TOFU Challenge Marketing Hackathon in NYC on March 27 (9 AM - 5 PM EDT). Team up with B2B marketers in a dynamic, friendly competition to tackle Thorn’s biggest marketing challenges and develop solutions we can implement. You’re invited to:
    💡 Create innovative solutions for a meaningful cause
    🛠️ Tackle a hands-on, real-life marketing challenge
    🎉 Compete in a fun, friendly atmosphere with peers
    🤝 Build deeper connections with other marketers
    📚 Learn from fellow professionals, mentors, and expert judges
    Are you a marketer ready to make a difference? We’d love to have you! 👉 Reserve your spot for this FREE Suru Labs event: https://lnkd.in/gJ8xjXJE

    View organization page for Suru Labs

    236 followers

    We've teamed up with Thorn for a one-day B2B Marketing Hackathon in NYC, where marketers will tackle some of Thorn's marketing challenges to help tell their story and amplify their mission.

    In a world where kids are constantly online, dangers lurk in places we can’t always see. That’s why Thorn exists. They build technology to fight child exploitation, protect vulnerable kids, and help law enforcement track down perpetrators before they can cause harm. Joining this event is an opportunity to use your marketing skills to give back. You'll spend the day collaborating, learning from peers, and coming up with solutions that will have real-world impact. We promise this will be one of the most memorable and inspiring events you’ve ever participated in. 👉 Reserve your spot: https://lnkd.in/gJ8xjXJE

  • View organization page for Thorn

    Thorn supports the Senate’s recent passage of the Take It Down Act and encourages the House of Representatives to prioritize this critical piece of legislation as a step toward protecting kids from deepfake nudes. Thorn’s latest research found that 31% of teens are already familiar with deepfake nudes, and 1 in 8 personally knows someone who has been targeted. These manipulated images can be used for harassment, blackmail, and reputational harm, causing significant emotional distress for victims. As deepfake technology grows more accessible, we have a critical window of opportunity to understand and combat this form of digital exploitation—before it becomes normalized in young people's lives—and to act on their behalf to defend them from threats. Read more about Thorn’s support of this bill on our blog: https://lnkd.in/g_7gBZMX

  • View organization page for Thorn

    🚨 New research: Deepfake nudes are the latest digital threat harming young people. Nearly 1 in 3 teens have heard of deepfake nudes, and 1 in 8 personally know someone targeted by deepfake nudes. The tools to create these images are easily accessible, and the emotional impact is real. 84% of young people recognize deepfake nudes as harmful to the victim—with emotional distress and reputational damage as their top concerns. While more than half say they’d seek help from a parent, only 34% of victims did. Now is the time to act before this emerging threat spreads and harms even more kids. Thorn’s latest report—"Deepfake Nudes & Young People”—sheds light on how deepfake nudes represent the newest type of non-consensual exploitation and why education, awareness, and tech-driven solutions are urgently needed. Read the full report linked below in the comments.

  • View organization page for Thorn

    Sexual predators are targeting children on a new playground: popular video games. We invite you to learn more about this important topic at our inaugural Spring Luncheon on Thursday, April 3, 2025, in NYC. Our panel of experts will explain the dangers in online gaming and provide valuable guidance on keeping children safe from this growing threat:
    ⭐ Detective Ryan Ellis - Clay County Sheriff's Office, member of the Northeast Florida INTERCEPT Task Force
    ⭐ Melissa Stroebel - Vice President of Research & Insights at Thorn
    ⭐ Nicki Reisberg - host of “Scrolling 2 Death,” a podcast for parents worried about social media
    Reserve your seat or table today: https://lnkd.in/ez429aXa All proceeds from the Spring Luncheon will support our mission to inspire, promote, and develop solutions to end sexual abuse, exploitation, and violence against children. Mary L. Pulido, Ph.D. #WorldChildhoodFoundation #ProtectOurChildren #onlinegaming

  • View organization page for Thorn

    How can philanthropy, policy, and technology work together to protect children? Thorn brought together global leaders for a powerful webinar about working to protect children in the digital age. In “Hope, Vision & Impact: Leaders Shaping the Future of Child Safety,” Julie Cordua (CEO at Thorn), Brigette De Lay (Director of the Prevent Child Sexual Abuse Programme at Oak Foundation), Julie Inman Grant (Australia’s eSafety Commissioner), and Sara Clemens (Interim Thorn Board Chair) share insights on the challenges and opportunities ahead. The discussion explores:
    🤝 Cross-sector solutions to accelerate change
    💡 Philanthropy’s role in long-term impact
    🌏 Global insights on online safety
    📱 The future of child protection in 2025
    Watch the on-demand webinar and learn how to turn your passion into impact: https://lnkd.in/eDr8cvyr

  • View organization page for Thorn

    Between 2021 and 2023, reports of online enticement—the category that includes sextortion—more than tripled.
    📈 44,155 reports in 2021
    📈 80,254 reports in 2022
    📈 186,819 reports in 2023
    Sextortion can happen on any platform with messaging—social media, dating apps, gaming sites, and texting apps. For too many young people, shame and fear have kept them from getting help in these situations. In fact, 85% of teens and young adults say embarrassment is a barrier to reaching out. Support from parents matters a great deal—and having open, judgment-free conversations with your child can make all the difference. Thorn has designed a guide specifically to aid parents here. Our guide gives you the tools you need to have an open dialogue about sextortion, create a safe space, and show your child that you'll always be there for them, no matter what. Start the conversation today. Read the guide here: https://lnkd.in/eMUYwWgB

  • View organization page for Thorn

    Making friends online is often a normal part of growing up today. It’s where many kids connect, share, and find support. But predators know this too—and they look to exploit it. 2 in 5 kids have been approached online by someone they thought was attempting to "befriend and manipulate" them. Online grooming isn’t always obvious. It often starts with trust—shared interests, personal stories, and secret-keeping. But predators can just as easily approach suddenly and out of the blue: 40% of minors have received requests for nudes from someone online with whom they’ve never had a previous interaction. For teens aged 13-17, that number jumps to nearly half (48%). Groomers use fear and shame to keep kids silent, leveraging tech to reach more victims faster than ever before. This isn’t just an issue for parents. Platforms, tech leaders, policymakers, and communities must work together to protect kids from exploitation in the digital spaces they call home.
    🔹 Awareness is the first step. Share what you learn and spread the word.
    🔹 Education is key. Talk to kids about online safety and recognizing red flags.
    🔹 Action is needed. Support organizations that are fighting to end child exploitation online.
    Together, we can create a safer internet and protect our children now.

  • View organization page for Thorn

    Thorn’s Dr. Rebecca Portnoff shares crucial insights in Navigating AI, a collection of expert perspectives published on Technically Optimistic by Raffi Krikorian and the Emerson Collective. At Thorn, we’re focused on safeguarding children in the GenAI era—because building responsible AI isn’t just about innovation, it’s about protection. This piece dives into:
    - Why testing & verification are critical for AI safety
    - How AI must be designed with the full system in mind
    - Why good intentions aren’t enough—we need action
    Read more below. 👇 #AIforGood #ChildSafety #ResponsibleAI #TechLeadership

    View profile for Dr. Rebecca Portnoff

    I’m excited to share this essay I wrote for Navigating AI (https://lnkd.in/e23bnQT7), published on the Technically Optimistic substack by Raffi Krikorian with the Emerson Collective. Navigating AI is a collection of articles by technologists from various sectors, exploring key AI topics and providing practical guidance for tech leaders. I was glad to get the opportunity to dig into several important topics - the role of testing and verification in building effective AI, the way in which the full system has to be taken into account when building AI, and how good intentions are a starting point, not an ending point. I hope you enjoy the read!


Funding

Total rounds
2
Last round
Grant
US$ 345.0K