Thorn

Nonprofit organization

Manhattan Beach, CA · 31,817 followers

About us

We are Thorn. Our mission of defending children from sexual exploitation and abuse is deeply embedded within our core—a shared code that drives us to do challenging work with resilience and determination. Here, you’ll work among the best hearts and minds in tech, data, and business, creating powerful products that protect children’s futures. Unleash your own formidable talents while learning among peers and growing every day. All in a supportive environment of wellness, care, and compassion. Build your career as we help build a world where every child can be safe, curious, and happy.

Website
https://www.thorn.org
Industry
Nonprofit organization
Company size
51-200 employees
Headquarters
Manhattan Beach, CA
Type
Nonprofit
Founded
2012
Specialties
technology innovation and child sexual exploitation

Locations

Employees at Thorn

Updates

  • You’re invited! Thorn’s CEO, Julie Cordua, and the founder of Liberty Law, Micha Star Liberty, are coming together for an exclusive conversation about the crucial work they’re doing to combat sextortion and online grooming at scale. Join us on Tuesday, October 29, and leave with a deeper understanding of how technology, legislation, and specialized programming are key to preventing these horrific forms of child exploitation. This is a conversation that can’t be missed. Register here: https://lnkd.in/gy6hMgiF


  • AI can help people create stunning artistic images and save time completing daily tasks—but it also brings a dark side. Nonconsensual, explicit AI-generated imagery is victimizing children. In a recent article in The Atlantic, Dr. Rebecca Portnoff, VP of Data Science at Thorn, shares how she fights this alarming trend by working at the intersection of machine learning and combating child sexual abuse. The research shows that 11% of kids ages 9-17 know someone who used AI to generate sexually explicit images of peers. However, there is hope. Thorn develops innovative solutions to help tackle this crisis and partners with tech companies to prevent AI misuse. It’s not easy, and meaningful change can’t happen overnight, but the work of people like Rebecca proves that with layered, coordinated action, we can make a difference. Read the full article: https://lnkd.in/e_jeGtFC

    High School Is Becoming a Cesspool of Sexually Explicit Deepfakes


    theatlantic.com

  • In our digital age, it’s common for young people to form connections and even romantic relationships online. Online spaces offer incredible opportunities for connection and growth, but they also come with unique risks we must address. One serious concern involves what are often referred to as “nudes.” Unfortunately, groomers may exploit emotions or use threats to coerce kids into sending these explicit images or videos of themselves. This content is more accurately termed self-generated child sexual abuse material (SG-CSAM). Even former romantic partners can betray trust by sharing consensually shared material to shame or retaliate against the victim. Our research reveals fewer than 1 in 3 parents have discussed SG-CSAM with their children. Even if you think your kids know you’ll support them, having these conversations makes it far more likely they will share their experiences with you if something goes wrong. Let’s stay committed to creating safer online environments. By engaging in regular discussions about digital safety, we can empower young people with the knowledge and strategies they need to protect themselves and seek help when necessary. Check out our resource, Safe Connections: A Guide to Protecting Your Child from Online Grooming, to learn more: https://lnkd.in/gK2YAi3c Together, we can ensure they enjoy the best of both worlds—online and offline—while staying safe.

    Safe Connections Guide


    info.thorn.org

  • 1 in 5 children aged 9-12 have experienced a sexual interaction online. Here’s how we can enhance digital literacy for children to help safeguard them online:

    - Integrate digital literacy into school curriculums
    - Offer workshops after school
    - Create interactive online learning tools
    - Provide more resources and training for educators

    https://lnkd.in/gJXyntmu

  • Our progress wouldn’t be possible without support from dedicated people like you. Thanks to advocates like you and our coordinated efforts with tech companies, our Safer platform has achieved incredible milestones in the fight to eliminate online child sexual abuse material (CSAM). Safer processed 71.4 billion files input by our customers in 2023, a 70% increase from 2022. Recently launched, Safer Predict enhances our existing tools with new capabilities, including the detection of text-based conversations that may indicate child exploitation. Combining cutting-edge AI technology with collaborative action across the industry enables us to tackle online child sexual abuse at scale. Together, we are improving detection and making strides toward a world where CSAM is eliminated from the internet. Your support fuels this progress and helps us build a safer, brighter future for every child.

  • Looking for an impactful way to support our mission and help create a safer world for children? Here are three easy ways you can take action today:

    - Join our community: Subscribe to our email list to stay in the know and learn the best, real-time ways to support our mission to defend children.
    - Learn about the issue: Gain a better understanding of how the intersection of child abuse and technology has created a public health crisis.
    - Share resources: Thorn for Parents offers resources and tips to equip parents for conversations with children about online safety.

  • Thorn’s CSAM Classifier swiftly identifies new and unreported child sexual abuse material (CSAM), allowing officers to find and remove victims from harm faster. Its state-of-the-art machine learning processes files far faster than a human could manually, transforming child protection efforts. Thorn’s CSAM Classifier not only accelerates investigations but also helps uncover the full extent of CSAM in an offender’s possession, enabling prosecutors to seek appropriate sentencing that reduces the time an abuser is out in the world, potentially harming kids. Learn how this powerful solution is changing the game in child protection: https://lnkd.in/e38qeMVJ

    How Thorn Helps Investigators Find Children Faster


    https://www.thorn.org

  • Imagine being targeted by a fake, explicit image of yourself. You’re most likely scared, confused, and worried that no one will believe you didn’t take the picture yourself. With the rise of deepfakes, this has become a terrifying reality for teens. These highly realistic fake images created with generative AI are being used in financial sextortion, primarily targeting boys ages 14-17. By raising awareness of these online risks, providing resources for support, and using technology to build safer online environments, we can mitigate the risk. We can’t leave young people responsible for protecting themselves or for making the case that they’ve been harmed. By working together with a multi-layered approach, we can help youth feel safe again.

    Deepfakes are Creating a Barrier to Youth Seeking Help From Sextortion


Thorn, on LinkedIn

  • Couldn’t join us live for our recent webinar, Breaking the Silence: Survivors and Parents Speak Out to Prevent Sextortion and Online Grooming? The recording is now available for you to watch at your convenience. In the recording you will hear from:

    - Pauline Stuart, advocate and parent of a financial sextortion victim
    - Lennon Torres, Campaign Director at Heat Initiative
    - Rosalia Rivera, consent educator, abuse prevention expert, and founder of CONSENTparenting

    Learn about the real impact of online threats on children and families, practical strategies for keeping kids safe online, and how you can contribute to a safer online environment. Watch the recap now: https://lnkd.in/eXwMPYEr


Affiliated pages

Similar pages

View jobs

Funding