A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related search terms were entered, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • afraid_of_zombies@lemmy.world · 10 months ago

    Maybe liability, or pretending to help? That way they can claim later on, “we care about people struggling with this issue, which is why when they search for terms related to it we offer the help they need.” Kinda like how if you search for certain terms on Google, it pops up a suicide hotline at the top.

    Ok Google, just because I looked up some stuff on being sad in winter doesn’t mean I am planning to put a gun in my mouth.

    • _cnt0@sh.itjust.works · 10 months ago

      Yah, this feels more like a legal protection measure and virtue signaling. There’s absolutely no assessment of the efficacy, let alone efficiency, of the measures. At least not in the article or the ones it links to, and I couldn’t find anything substantial on it.