• Stefen Auris@pawb.social · 44 points · 2 days ago

    “make it work”

    It already does. I find enjoyment on this interconnected platform. Perhaps by “make it work” they mean make it work for massive monetization, which is what we are trying to escape from in the first place. Not everything has to be about money!

    • HobbitFoot · 4 points · 1 day ago

      Or just make it work at scale.

      Lemmy at a million users would look far different than it does now.

    • MostlyBlindGamer@rblind.com · 13 points · 2 days ago

      Right, as an instance admin, I’m very confident I can avoid Facebook’s mistakes. I don’t have the same motivations.

    • Chris Remington@beehaw.org (OP, mod) · 25 points · 2 days ago

      The last sentence of the article clears this up.

      “That will mean users putting in the work to make sure they remain safe, accessible, noncommercial and well respected.”

  • flamingos-cant@feddit.uk · 27 points · 2 days ago (edited)

    “Though this content could flourish in pockets of the fediverse, the scary scenario of prevalent child sexual abuse material is not the case. There are many moderation tools, including shared blocklists, that prevent it. However, the idea that the fediverse is full of harmful content was used by Elon Musk to justify his anti-competitive decision to block links from X to Mastodon.”

    Didn’t he unban someone who posted one of the worst CSAM videos known?
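    The “shared blocklists” the quoted article mentions are, in practice, lists of instance domains that admins import and refuse to federate with. Here is a minimal sketch of the idea, assuming a hypothetical CSV layout and made-up function names (loosely modeled on Mastodon-style domain-block exports, not a real Mastodon or Lemmy API):

    ```python
    # Hypothetical sketch: filtering incoming federation traffic against a
    # shared domain blocklist. The CSV column name ("domain") and all
    # function names here are illustrative assumptions, not a real API.
    import csv
    from urllib.parse import urlparse

    def load_blocklist(path: str) -> set[str]:
        """Read blocked instance domains from a shared CSV blocklist."""
        with open(path, newline="") as f:
            return {row["domain"].strip().lower() for row in csv.DictReader(f)}

    def is_federation_allowed(actor_url: str, blocked: set[str]) -> bool:
        """Drop activities whose actor is hosted on a blocked instance."""
        domain = (urlparse(actor_url).hostname or "").lower()
        return domain not in blocked

    if __name__ == "__main__":
        # In practice: blocked = load_blocklist("shared_blocklist.csv")
        blocked = {"bad.example"}
        print(is_federation_allowed("https://bad.example/users/spammer", blocked))  # False
    ```

    In a real server this check would sit in the inbox handler that receives federated activities; the point is just that a handful of shared lists lets small instances keep out known bad actors without a Facebook-scale moderation team.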

      • Nollij@sopuli.xyz · 10 points · 2 days ago

        There are some public numbers on how many occurrences are found each year on the major platforms.

        IIRC, Facebook dealt with around 75 million reports per year, while Twitter, Reddit, and others were around 20 million reports per year.

        I don’t know how many are dealt with on Mastodon or Lemmy (or how you’d even get reliable numbers for that), but something tells me it’s a lot less than the bigger platforms these days.

        • Lime Buzz (fae/she)@beehaw.org · 2 points · 1 day ago

          We have heard of some here and there. The biggest problem is instances with open signups; they’re the ones that tend to get CSAM. That, and instances that see nothing wrong with ‘lolicon’.

        • Chris Remington@beehaw.org (OP, mod) · 5 points · 2 days ago

          In the four years that I’ve been an admin here, I’ve only seen one CSAM case. I don’t want to see another one. It was very difficult to deal with on a personal level.

          • elfpie@beehaw.org · 2 points · 11 hours ago

            I’m very sorry you had to go through that. People might not realize how traumatizing dealing with it can be. It definitely shouldn’t be the responsibility of people without proper support or training.