- cross-posted to:
- [email protected]
cross-posted from: https://kbin.social/m/[email protected]/t/525635
> A nightmare scenario previously only imagined by AI researchers — AI image generators accidentally spitting out non-consensual pornography of real people — is now reality.
Oh no, if it isn’t the consequences of their own actions.
They really shouldn’t have used training data that was obtained without consent.