Representative take:

If you ask Stable Diffusion for a picture of a cat it always seems to produce images of healthy-looking domestic cats. For the prompt “cat” to be unbiased, Stable Diffusion would need to occasionally generate images of dead white tigers, since those would also fit under the label of “cat”.

  • self@awful.systems · 1 year ago

    I had severe decision paralysis trying to pick out quotes because every post in that thread is somehow the worst post in that thread (and it’s only an hour old, so it’s going to get worse), but here:

    Just inject random ‘diverse’ keywords in the prompts with some probabilities to make journalists happy. For an online generator you could probably take some data from the user’s profile to ‘align’ the outputs to their preferences.
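For reference, the approach the quoted comment is describing amounts to something like the sketch below: splice “diversity” keywords into the prompt string with fixed probabilities before the model ever sees it. This is purely illustrative of the quoted proposal, not anything Stable Diffusion actually does; the keyword table and probabilities are made up.

```python
import random

# Hypothetical keyword table for the injection trick the comment
# proposes; the words and probabilities are invented for illustration.
INJECTION_TABLE = {
    "diverse": 0.3,
    "inclusive": 0.2,
}

def inject_keywords(prompt, rng=None):
    """Return the prompt with zero or more keywords randomly appended."""
    rng = rng or random.Random()
    extras = [kw for kw, p in INJECTION_TABLE.items() if rng.random() < p]
    return ", ".join([prompt, *extras]) if extras else prompt
```

The point of the surrounding criticism is that this rewrites the prompt string after the fact instead of addressing bias in the training data itself.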

    solving the severe self-amplifying racial bias problems in your data collection and processing methodologies is easy, just order the AI to not be racist

    …god damn that’s an actual argument the orange site put forward with a straight face

    • Throwaway@lemm.ee · 1 year ago

      It works with other obvious stuff. Putting the words “best, good, high quality” in your prompt actually makes the generated images better.
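The “quality keywords” trick being referenced is just string augmentation before generation, something like the sketch below. The tag list and helper name are illustrative assumptions; the actual image-generator call is omitted.

```python
# Illustrative stock qualifiers; not a real Stable Diffusion API.
QUALITY_TAGS = ("best quality", "high quality", "detailed")

def add_quality_tags(prompt):
    """Append any quality tags the prompt doesn't already contain."""
    missing = [t for t in QUALITY_TAGS if t not in prompt.lower()]
    return ", ".join([prompt, *missing]) if missing else prompt

print(add_quality_tags("a cat on a windowsill"))
# -> a cat on a windowsill, best quality, high quality, detailed
```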

        • froztbyte@awful.systems · 1 year ago

          I did not expect to get back to my laptop late on a Friday and see someone go “Well Akshoewally, If You Just Sing Gentle Sweet Songs To The Prompt then you get the socks you wanted”

          but I guess the orange site had a spillover and has me covered today!

          • self@awful.systems · 1 year ago

            leading to the obvious question: if putting the words “best, good, high quality” in your generative AI prompt isn’t a placebo, then why is all the AI art I’ve seen absolute garbage?

            • froztbyte@awful.systems · 1 year ago

              I forget where I saw it, but the phrase/comparison stuck with me and I think of it often: all of this shit is a boring person’s idea of interesting

              but the “just slap some prompt qualifiers on it (to deal with the journalists)” …god. it is of course entirely unsurprising that an orange poster would be so completely assured of their own correctness that they don’t even question anything, but the outright direct “just dress it up in vibes until they shut up”

              you just have to wonder what (and who?) else in their life they treat the same way

              • 200fifty@awful.systems · edited · 1 year ago

                a boring person’s idea of interesting

                Agh, this is such a good way of putting it. It has all the signifiers of a thing that has had a lot of detail, care, and effort put into it, but none of the actual parts that make those things interesting or worth caring about. But of course it’s going to appeal to people who don’t understand the difference between those two things and only see the surface signifiers (marketers, executives, and tech bros being prime examples of this type of person).

                ETA: and also, of course, this explains why their solution to bias is “just fake it to make the journalists happy.” Why would you ever care about the actual substance when you can just make it look OK from a distance?