- cross-posted to:
- [email protected]
‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
Different only in construction. Why they exist and what they are is older than photography.
No, I disagree. Before, you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing.
That is a quality improvement, not a shift in nature.
Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.
I’m not saying that it’s a shift in nature. All I’ve been saying is:
A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing
B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they’re more convincing and can therefore have more detrimental effects
The difference is that we can now do video. In principle that was possible before, but it was also a hell of a lot of work. Making it look real hasn’t been a problem since before Photoshop; if anything, people get sloppy with AI, partly because what feels like 99% of people who use AI don’t have an artistic bone in their body.
There was only a brief period, between the invention of photography and now, when that was true. For thousands of years before that it was possible to create a visual representation of anything you imagined without any hint that it wasn’t real. Makes me wonder if there were similar controversies about drawings or paintings.