In a two-day research period, the team discovered 112 instances of known child pornography across 325,000 posts on Mastodon. Alarmingly, the first instance of such material was identified within just five minutes of the investigation. “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close,” said David Thiel, one of the report’s researchers.

    • GeraldEstaban@exploding-heads.com
      1 year ago

      Couldn’t agree more. Especially considering beecuck and blablaj-zone or whatever they’re called literally harbor LGBT aka some of the worst pedos imaginable. Anyone who thinks I’m lying just know that they’ve encouraged people to take their kids to drag shows, or try and encourage child mutilation though surgery or hormones, absolutely disgusting 🤮.

  • hoodlem@hoodlem.me
    1 year ago

    The researchers employed Google’s SafeSearch API and PhotoDNA, a tool designed to identify known abuse images, to conduct their investigation.

    Ok, make these tools available for Mastodon to use? Whether automatic or in tools for admins or moderators.
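    The moderation tooling the comment asks for boils down to hash-list matching: compute a fingerprint of each uploaded image and check it against a list of fingerprints of known material supplied by a clearinghouse. Below is a minimal, purely illustrative sketch of that idea. Note the simplifications: PhotoDNA is a proprietary *perceptual* hash that tolerates resizing and re-encoding, whereas this sketch uses exact SHA-256 matching, and the "known" hash list here is made-up example data, not a real provider feed.

    ```python
    # Illustrative hash-list matching, the general shape of PhotoDNA-style
    # scanning. Assumption: real deployments use a perceptual hash and a
    # vetted hash list from a provider (e.g. NCMEC); this uses exact
    # SHA-256 matching on fabricated example bytes.
    import hashlib

    def sha256_hex(data: bytes) -> str:
        """Fingerprint an upload. A real system would use a perceptual hash."""
        return hashlib.sha256(data).hexdigest()

    # Hypothetical hash list an admin tool might load at startup.
    known_hashes = {
        sha256_hex(b"known-bad-example-1"),
        sha256_hex(b"known-bad-example-2"),
    }

    def is_flagged(upload: bytes) -> bool:
        """True if the upload's fingerprint appears in the known list."""
        return sha256_hex(upload) in known_hashes

    print(is_flagged(b"known-bad-example-1"))  # in the example list
    print(is_flagged(b"harmless cat photo"))   # not in the list
    ```

    The set lookup is O(1) per upload, which is why this pattern scales to scanning every incoming post; the hard parts in practice are access to the hash lists themselves and the perceptual-hash licensing, which is exactly the gap the comment points at for Mastodon admins.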