In a two-day research period, the team discovered 112 instances of known child sexual abuse material across 325,000 posts on Mastodon. Alarmingly, the first instance was identified within just five minutes of starting the investigation. “We got more PhotoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” said David Thiel, one of the report’s researchers.

  • hoodlem@hoodlem.me · 1 year ago

    The researchers conducted their investigation using Google’s SafeSearch API, which flags sexually explicit images, and Microsoft’s PhotoDNA, which matches image hashes against a database of known abuse material.

    Ok, so make these tools available for Mastodon to use? Whether automatically, or as tooling for admins and moderators.
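Of the two tools, only SafeSearch has a public API today: it is exposed through the Google Cloud Vision API as the SAFE_SEARCH_DETECTION feature. PhotoDNA access requires a signed agreement with Microsoft and carries reporting obligations to NCMEC, so individual server admins cannot simply drop it in. Below is a minimal sketch of what a per-upload moderation hook using SafeSearch could look like, assuming the google-cloud-vision client library is installed and credentials are configured; the screen_upload function name and the LIKELY review threshold are illustrative choices, not anything Mastodon actually ships.

```python
# Hypothetical per-upload screening hook built on the Google Cloud Vision
# SafeSearch feature. Assumes `pip install google-cloud-vision` and that
# Application Default Credentials are configured for the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the image should be held for moderator review."""
    image = vision.Image(content=image_bytes)
    response = client.safe_search_detection(image=image)
    annotation = response.safe_search_annotation

    # SafeSearch returns a Likelihood enum per category: UNKNOWN,
    # VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    # Treating LIKELY or above as a hit is an illustrative threshold;
    # a real deployment would tune this and queue hits for human review.
    threshold = vision.Likelihood.LIKELY
    return (annotation.adult >= threshold
            or annotation.racy >= threshold)
```

Note that matching known material, as the researchers did with PhotoDNA, is a different problem from classifying new images: it compares perceptual hashes against a curated database rather than running a classifier, which is why the study used the two tools together.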