• PrimalAnimist@lemmy.world · 1 year ago

    I feel this won’t happen. If big subs stay dark too long, the Reddit admins will simply remove the hostile volunteer mods and reopen the subs. The mods are used to being gods of their little domains. If they cross the line, they will be reminded that they own nothing. They can obey Reddit or be replaced.

    That is what I see in the future for any mods that try to hold subs hostage indefinitely.

    • NotYourSocialWorker@lemmy.world · 1 year ago

      A possible problem is that they would be forced to find new volunteers to run them. While I bet there are many who want to be “gods”, I bet it’s harder to find people who can do it well enough to run a 10+ million member forum, especially while hamstrung by Reddit’s lack of mod tools.

      So sure, Reddit can remove the mods, and do it multiple times, but it will continually lead to a worse experience, and sooner or later an unacceptable amount of spam, hate and CP will cause the advertisers to pull their ads.

      • taj@lemmy.world · 1 year ago

        This is absolutely true. There are often calls for ‘anyone want to mod?’ on even smaller subs… and you know, it sounds fun to a lot of folks at first. But if you’ve ever actually been a mod, even of a smaller community online? It loses its appeal very quickly.

          • Chewy12@lemmy.world · 1 year ago

            I’ve seen plenty of communities where it’s clear that the mods only stop by from time to time, and they get by just fine; spam and malicious posts are still a small minority. Some set AutoMod to a shoot-first, ask-questions-later setting where all reported comments get deleted until a mod restores them.
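
            Roughly, that “shoot first” behaviour can also be scripted outside of AutoMod. Here is a minimal sketch using PRAW, Reddit’s Python API wrapper (the credentials and subreddit name are placeholders, and this is only an illustration of the idea, not anyone’s actual config):

            ```python
            import praw

            # Placeholder credentials for illustration only.
            reddit = praw.Reddit(
                client_id="CLIENT_ID",
                client_secret="CLIENT_SECRET",
                username="mod_bot",
                password="PASSWORD",
                user_agent="shoot-first-modbot/0.1",
            )

            subreddit = reddit.subreddit("SUBREDDIT_NAME")

            # Remove everything users have reported; a human mod can
            # re-approve anything that was flagged unfairly later.
            for item in subreddit.mod.reports(limit=None):
                item.mod.remove()
            ```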

            I really don’t think finding new moderation will be an issue. As much as it would be nice for Reddit to be screwed over by the mods, it’s going to be a non-issue for them; there are already measures in place to prevent subreddit parking, and plenty of willing volunteers.

      • PrimalAnimist@lemmy.world · 1 year ago

        Personally, I feel ad revenue is not their top monetization priority. It’s speculation of course, but I think they are learning that the free content the users create will generate much more revenue from mega corps who want access to all of it to train emerging AIs. Data, specifically YOUR data, is valuable. What posts do you look at? What do you upvote? What do you downvote? What subreddits do you subscribe to? There is a wealth of information there that they will monetize. This is why I think they don’t care that the little app devs can’t afford the new API pricing: they can’t give the app devs one price and then expect Microsoft, Google, Apple and other multi-billion-dollar corps to pay a higher one.

        Again, this is just my speculation, but the suddenness and the exorbitant price mean they want to act now and capitalize on this new market while it’s good. Their terms of service specifically say that for everything you post, you give them a license to use, sell, or sub-license it, without dispute, forever. This isn’t about ads.

        • Doggylife@lemmy.world · 1 year ago

          I was thinking the same thing. That’s probably why the timeline is so fast too, with only a month’s notice of the API costs. The same could also be true of Twitter.

          ChatGPT and other LLMs are gaining a lot of value from information freely available online, and sites with large volumes of user-generated text, like Reddit and Twitter, want a piece of the pie.