‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • azertyfun@sh.itjust.works · 1 year ago

    I’ve seen ads for these apps on porn websites. That ain’t right.

    Any moron can buy a match and a gallon of gasoline, freely and legally, and that’s a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that’s already a huge win.

    • KairuByte@lemmy.dbzer0.com · edited · 1 year ago

      I mean, you’ve been able to do a cursory search and find dozens of “celeb lookalike” porn videos for many years now. “Scarjo goes bareback” isn’t hard to find, but that ain’t Scarjo in the video. How is this different?

      Edit: To be clear, it’s scummy as all fuck, but still.

      • shuzuko@midwest.social · 1 year ago

        This is different because, to a certain extent, people in the public eye can expect, anticipate, and react to/suppress this kind of thing. They have managers and PR people who can help them handle it in a way that doesn’t negatively affect them. Billy’s 13-year-old classmate Stacy doesn’t have those resources, and now he can do the same thing to her. It’s on a very different level of harm.

        • KairuByte@lemmy.dbzer0.com · 1 year ago

          Billy doesn’t need a nudify app to imagine Stacy naked. Not to mention, images of a naked 13-year-old are illegal regardless.

          • azertyfun@sh.itjust.works · 11 months ago

            Why are you pretending that “nudify apps” produce ephemeral pictures equivalent to a mental image? They most definitely are not.

            Underage teenagers already HAVE shared fake porn of their classmates. It being illegal doesn’t stop them, and as fun as locking up a thirteen-year-old sounds (assuming they get caught, prosecuted, and convicted), that still leaves another kid traumatized.

            • KairuByte@lemmy.dbzer0.com · 11 months ago

              So if illegality doesn’t stop things from happening… how exactly are you stopping these apps from being made?

              • azertyfun@sh.itjust.works · 11 months ago

                Go after the people advertising those apps. Developers and advertising agencies who say, or intentionally imply, “create naked pictures of people you know” should all be prosecuted.

                Unlike Photoshop or generic Stable Diffusion software, these apps have literally no legitimate reason to exist, since the ONLY thing they facilitate is creating non-consensual pornography. Seems like something that would be very easy to criminalize.

                • KairuByte@lemmy.dbzer0.com · 11 months ago

                  So wait, we can’t criminalize the use, but if we criminalize the advertisement it fixes the situation?

                  You realize the exact same problem exists? There are plenty of tools with illegal uses, easily accessible online right now. Many on GitHub.

          • Sweetpeaches69@lemmy.world · 11 months ago

            Just as the other people in this made-up scenario don’t need an app to imagine Scarlett Johansson naked. It’s a moot point.

            • CleoTheWizard@lemmy.world · 11 months ago

              I think most of this is irrelevant, because AI image generation as a tool is inherently hard to limit in this way, and I think it will become so prevalent as to be effectively unregulable. What I’m saying is: we should prepare for a future where fake nudes of literally anyone can be made and shared easily. It’s already too late. These tools, as was said earlier, already exist and are here. The only thing we can do is severely punish people who post the photos publicly. Sadly, we know how slow laws are to change, so in that light we need to legislate based on long-term impact instead of short-term reactions.

            • KairuByte@lemmy.dbzer0.com · 11 months ago

              And?… There’s a major difference between “a lookalike of a grown adult” and “AI-generated child porn”, as I’m sure you’re aware. At no point did anyone say child porn was going to be legal, until the person I was replying to brought it up as a strawman argument. ¯\_(ツ)_/¯