this post was submitted on 22 Aug 2023
Technology
you are viewing a single comment's thread
As someone who personally wouldn't care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn't like me may have the option to generate AI porn of me having sex with a child. Now there's fake "proof" I'm a pedophile, and I get my life ruined for sex I never had, for a violation of consent I never actually committed. Even if I'm vindicated in court, I might still be convicted in the court of public opinion. And people could post faked porn of me and send it to companies to claim "Evergreen5970 is promiscuous, don't hire them." Not all of us have the luxury of picking and choosing between companies depending on whether they match our values; some of us have to take what we can get, and sometimes that includes companies that would judge you for taking nude photos of yourself. It would feel especially bad given that I'm a virgin by choice who has never taken nudes, let alone sent them. Punished for something I didn't do.
Not everyone is going to restrict their use to private wank sessions, to making a realistic image of the stuff they probably already envision in their imagination. Some will do their best to make the results public, with the full intention of using them to do harm.
And once faking abuse with AI porn becomes well-known, it might discredit actual photographic/video proof of CSAM happening. Humans already struggle to tell whether an image was captured by a camera or generated by AI, and automated detectors don't catch AI-generated images with perfect accuracy either. So the question becomes "how can we trust any image anymore?" Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might try to tweak their systems to prevent people from generating any porn involving minors, but there'll probably always be some floating around with those guardrails turned off.
I'm also very wary of dismissing other people's discomfort just because I don't share it. I'm still worried for people who would care about someone making AI porn of them, even if it was kept private and only used to masturbate with.