autismdragon [he/him, comrade/them]

A real autism! He/him bisexual man that might be a demiboy but probably not. I like media. Extraordinarily DILF.

  • 387 Posts
  • 2.58K Comments
Joined 4 years ago
Cake day: September 8th, 2020



  • He’s technically right that it would be hard to legislate consistently and that there would be weird grey zones in doing so, but when it comes to deepfakes of actual flesh-and-blood children, which is what’s being described here, this falls squarely into the not-at-all-grey area lmao. It’s just pointless to bring that up. Laws being hard to enforce correctly isn’t inherently a reason not to have them, because some things should very obviously be off limits no matter what, and deepfakes of actual flesh-and-blood children obviously fall in that category.

    Like I don’t even know that this dude is an actual pedophile lmao, my read here is that he’s a weird-as-fuck pedant who chooses incredibly strange hills to die on and isn’t even right in applying the arguments he’s trying to apply. Baffling. Like, he doesn’t even think this should be legal? He just thinks it should be illegal for a different reason? Except that’s not even correct, because the “defamation and libel” he cites IS ITSELF ABUSE OF A CHILD. Like, god, just thinking about how absurd these arguments are is blowing my mind.


  • This is manipulative of him and a misuse of an argument people have made about a different situation. A trusted user here once correctly pointed out that lolicon should not be called CSAM, because the victims’ orgs who created that term have been very serious about making sure it is used specifically to describe situations in which a child was actually abused. That doesn’t make lolicon morally acceptable to create or consume, just that it’s an incorrect use of an important term.

    BUT THAT’S NOT THE ISSUE BEING DISCUSSED HERE. Making deepfakes of an actual flesh-and-blood child that actually exists is child abuse. The “defamation and libel” he describes is itself abusive, so it’s not incorrect to call it CSAM.

    It’s just weird pedantry and a fucking weird-ass hill to die on lmao, in addition to not being correct.


    1. You know as much as I do that legality =/= morality so don’t try to turn this into a cultural chauvinism thing or some shit. Legality wasn’t even brought up in the posts. We’re talking about right and wrong.
    2. A trusted user here has argued that lolicon, while morally wrong to produce, should not be called CSAM, because an actual child was not sexually abused, and victims’ orgs have advocated for being specific when we use the term they created (CSAM). They explained it better than I can, so I won’t try to go any further with that. But here’s the thing: this situation is not lolicon. A child is being abused, because AI is being used to create nude images of an actual physical flesh-and-blood child. This is not the same as that, because a child is in fact being abused even if the abuse isn’t physical.
    3. Why did he find this distinction worth arguing in the first place??? Why is this a hill to die on?
    4. No one’s brought it up, but how is it “pro-establishment” lmao, I just want that bit clarified.