Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You OK with that?
The harm is:
No, but the harm certainly is not the same as CSAM and it should not be treated the same.
As far as I know there is no good evidence that this is the case, and it is a big controversy in the topic of fake child porn, i.e. whether it leads to more child abuse (by encouraging paedophiles), less (by giving them a safe outlet), or no change.
If someone fantasises about me without my consent I do not give a shit, and I don’t think there’s any justification for caring. I would give a shit if it affected me somehow (this is your first bullet point, but for a different situation, to be clear), but that’s different.
Hm. I wasn’t expecting the pro-child porn argument. All I can say is that’s absolutely legally and morally CSAM, and you’re fuckin nasty. Oof. Not really gonna bother with the rest because, well, yikes.