Hi!

A bit of background/motivation: Sharing photos of protests can be an important part of the PR of political organizations. However, not everyone feels safe showing their face in connection to political organizing. That’s why faces are usually pixelated, or people wear face-covering masks (which might be illegal at protests in some jurisdictions). Pixelated/hidden faces look quite off-putting to normies, though, which can reduce the effectiveness of the publication.

So I had this idea: What if, instead of pixelating the faces, I ran some CV software on the image and all the faces got swapped with the faces of Hedy Lamarr, Diego Luna, or JC Denton? I remember that Snapchat could do live faceswaps with the selfie cam ten years ago, so some desktop software like that shouldn’t be too hard to find in 2025, right? /j

Unfortunately, all the stuff I managed to find was either computer science projects in which you train some monster model on one hell of a dataset for each face you want to replace/emplace (which defeats the purpose of anonymizing political activists), or some obnoxious AI startup that is waaaaay too busy sucking off Elon Musk and/or Sam Altman. I don’t want to give my money/data to some doomed AI startup that ends up selling our likenesses to the NSA.

TL;DR: Is there some kind of desktop software which detects faces in an image and swaps them with another face? It’s ok if there’s only a framework (as long as it’s not as bad as all the horrible OpenCV results you find in online tutorials).
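
To illustrate the kind of thing I mean, here’s a minimal detect-and-paste sketch (assuming Python plus OpenCV’s bundled Haar cascade; the file names are placeholders, and this only pastes a replacement image over each detected face rather than doing a proper blended swap):

```python
# Minimal detect-and-cover sketch: find faces, paste a replacement face
# over each detection. Not a seamless swap, just an illustration of the idea.
import cv2

# Haar cascade that ships with OpenCV (no training needed)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

img = cv2.imread("protest_photo.jpg")             # placeholder file name
replacement = cv2.imread("replacement_face.png")  # placeholder file name

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Resize the replacement face to the detected box and paste it in
    img[y:y + h, x:x + w] = cv2.resize(replacement, (w, h))

cv2.imwrite("anonymized.jpg", img)
```

A real faceswap would still need landmark alignment and blending on top of that, which is what the dedicated frameworks handle.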

Edit: I found something that I can work with

  • Prunebutt@slrpnk.netOP · 17 hours ago
    How would I be “endangering people” by hiding the faces of people in my org? WTF is wrong with you?

    • outhouseperilous@lemmy.dbzer0.com · 17 hours ago
      Okay, so that’s people making videos of themselves on purpose, to associate their identities? And nobody who didn’t opt in? Then why does any measure of caution apply? What am I missing?

      • Prunebutt@slrpnk.netOP · 17 hours ago
        What am I missing?

        Basically every bit of context.

        I asked about software to faceswap photos, for when my org wants to publish a pic where everyone in it doesn’t have to pixelate their face, but can instead swap it with someone else’s (generated faces, historical figures, etc.).

        I’d like to try that since every time my org wants to take a photo (e.g. to show international solidarity on social media), an argument arises over whether or not to pixelate the faces. Some people want their identities protected, others think that pixelated faces damage the public perception of the org.

        How does that relate to anything you’re saying?