this post was submitted on 28 Sep 2023
141 points (85.8% liked)

No Stupid Questions


There have been a ton of CSAM and CP arrests in the US lately, especially of cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being's innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as that of a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go with the new AI being able to put anyone's face into a porno movie too.

It seems to me that a whole new set of worldwide guidelines and laws need to be put into effect asap.

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

[–] Veraticus@lib.lgbt 180 points 9 months ago* (last edited 9 months ago) (6 children)

I'm only going to answer the first part of your question, not the AI/generated part.

No one really chooses what or who they're attracted to; it kind of just happens to you. For example, you might be watching a TV show and someone gets lightly, comically spanked... and suddenly a light bulb goes off above your head and you think, "whoa, that might actually be kinda fun." People are wired in ways we don't understand to want things we don't even know we want.

To that extent, pedophiles are themselves victims of their own desires; there's no "logic" behind it. It's simply an urge they experience.

Of course that doesn't make succumbing to this urge excusable, and any children who are impacted are of course victims and the pedophiles, predators. But no one is training pedophiles in pedophile camp. It's just humans being human, unfortunately.

[–] HappycamperNZ@lemmy.world 84 points 9 months ago (1 children)

This is something many people fail to realize: while society hates that it exists, it is just an urge, the same as my desire for women. We have just grown as a society to say this isn't right (correctly). There are many who have the urge and don't act on it, but it's still there, and they are victims as well.

Fully agree though, this does not excuse those who act upon it, promote it or sell it.

[–] CaptainEffort@sh.itjust.works 33 points 9 months ago (2 children)

Unfortunately I think it’s probably in the same vein as any fetish or preference, so completely out of their control.

Obviously people who act on it are the scum of the earth, but those who simply battle with the urge I have nothing but sympathy for. I can’t even imagine how horrible it is to have to deal with that daily and never be able to do anything about it, or even really talk to anyone about it.

[–] Fredselfish@lemmy.world 54 points 9 months ago (3 children)

I have heard that kids who are molested, or who end up fooling around at a super young age, can grow up wired to be attracted to young kids or teens.

The real issue no one wants to address is that people who have these desires, and know they're wrong, have nowhere to turn for help.

Even if they haven't abused anyone, coming out and telling a therapist or anyone else that you are attracted to kids probably can and will get you locked up. There are those who never offend, but a lot do, because either a) they accept what they are and have no moral objections to it, or b) they can't get the help needed to fight the urges and end up offending.

As @Veraticus said, there is no easy answer, because it's not a choice. It'd be like asking you why you like women, or why people are gay. They're wired that way, and unfortunately I don't think you can cure it.

We definitely need to address access to any kind of porn of it, and if someone offends we must lock them away for their own good. Not saying prison, but somewhere they can be mentally evaluated.

[–] Instigate@aussie.zone 28 points 9 months ago (1 children)

There is definitely a link between having experienced sexual abuse as a child without any therapy or counselling to help them make sense of it and then later on sexually abusing other children, but it’s not super clear-cut and definitely not predictable.

[–] Fredselfish@lemmy.world 16 points 9 months ago

Yes, same with how some girls who are raped or molested become promiscuous, but that doesn't mean all girls in that situation will. It's definitely why we need better sex education in America, so we can teach kids the signs of an adult being inappropriate and let them learn about their own bodies.

[–] DingoBilly@lemmy.world 7 points 9 months ago (4 children)

This is the most accurate answer, and the fact it's all cultural/social is quite important as well.

If you were born a few thousand years ago, it may have been considered completely reasonable to sleep with a kid. Hell, the kid was probably your slave, so you could literally do whatever you wanted with them.

But just as I don't understand certain fetishes or even just people attracted to the same sex, others won't understand why people would be attracted to kids.

[–] XbSuper@lemmy.world 7 points 9 months ago

This is one of the most sane responses I've ever seen.

I am one of those poor souls who has these urges, but has never, and will never, act on them.

I'm willing to open myself to an AMA for anyone interested.

[–] Deestan@lemmy.world 45 points 9 months ago* (last edited 9 months ago)

To answer some of the questions:

I cannot understand the attraction to kids.

There was a TV interview with people who were seeking help for pedophilia. They described it as just plain horny sexual attraction that they knew they had to not act on. I guess people have different reasons, and some probably manage to rationalize it as "relationships" as you say.

Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

Whether it is a modified image of a real person or a purely generated picture, it will fall under the same laws as any other such depiction, which is already uncontroversially illegal.

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

Hard, as there are many ways to describe nudity or encourage the generator to weight towards nudity: "person with visible thighs, no skirt" and such.

Easier to leave nudity out of the training data, which is already common.

Then hard again because anyone can throw together a new image generator trained on what they want and no word filters.

[–] dameoutlaw@lemmy.ml 32 points 9 months ago (1 children)

I see people here attempting to equate it with a natural attraction and/or a fetish. We all have internet access and can look it up: both pedophilia and pedophilic disorder are in the DSM-5 and ICD-10. Especially in this more advanced age of medicine, science, and society, if it were natural I believe the corrections would have been made, or strongly advocated for; they haven't been. What is advocated is using terminology correctly, and encouraging those who experience this to feel comfortable telling their truth and seeking help. I believe some of your comments are very dangerous, and some of the upvotes and downvotes are concerning and make it difficult to distinguish whether you are in support of protecting children. The point is: please don't just blanket-label it and compare it to things that are harmless, legal, and consensual.

[–] VelvetStorm@lemmy.world 30 points 9 months ago* (last edited 9 months ago) (1 children)

Edit: also, don't be sexist; call it what it is: rape, not seduction. Just because it was a woman doesn't make it not rape. Calling it anything else is doing a disservice to all of the male victims of female-on-male rape.

If the AI porn is depicting real-life underage people, then it should be, and in some places is, illegal. Now, I don't like it and find it reprehensible, but if it's not depicting real-life people then it should be legal.

I think if you are into that then you should be able to seek professional help without a fear of it ruining your professional and personal life, but if you are attracted to kids then it is your moral responsibility and obligation to not work with or be around kids.

This is a mental issue and it should be treated like one and we should be trying to understand it and find ways to prevent and treat it.

[–] snausagesinablanket@lemmy.world 10 points 9 months ago* (last edited 9 months ago)

I was reading this, and it made me remember how a dude, in Australia I believe, bought an underage sex doll. It ended up being flagged somehow, and the government arrested the guy when it arrived. I have no idea what happened to him.

Was this guy trying to control his urges by using this hunk of rubber, or is that a crime too? This is a very edgy area, and I am thinking some countries won't bother with these cases, while in others they might incur the death penalty.

[–] HelixDab2@lemm.ee 27 points 9 months ago (6 children)

There are multiple parts to your question. I'm going to try to break it down.

First, there's a difference between a pedophile and a child molester. Pedophilia is a sexual attraction to children, but it does not, by itself, require the person to take action. A child molester is a person that sexually assaults children. It's the difference between being heterosexual, and being a rapist; you can be straight and still be entirely celibate.

Child molesters may not be sexually attracted to children at all; some might be, but people that commit rape aren't usually doing it solely for sexual gratification, although sex is definitely part of it.

We don't know how common pedophilia is because of how heavily stigmatized it is.

You don't understand how a person could be sexually attracted to children; the simplest way to explain it is to ask if you can understand how a man can be sexually attracted to another man. IIRC, most research indicates that pedophilia probably is a sexual orientation, much like being straight, gay, or bisexual is (except that there is no moral or ethical way for a pedophile to have a sexual or romantic relationship with a child; that is always both predatory and criminal). Do pedophile child molesters believe that they're having a relationship? Some of them, yes. They're able to delude themselves into believing that the child wants the attention and sex (really sexual assault), when they're--probably--the one that has groomed the child in the first place.

I cannot understand the attraction to kids. Even teens.

I can. When I was a child, I was sexually attracted to my peers. 14yo kids are having sex with each other, so clearly they're attracted to each other. As an adult, I can see women in their 20s as being sexually attractive, while still having zero interest in them (y'all seem really young, and not in a good way, if y'know what I mean). Sexual maturity isn't a magical thing that happens when you hit 18 (or whatever the age of consent is where you live); it's a sliding scale.

Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

I don't think that you can make a person into a pedophile, any more than you can make a person gay. A person either is, or isn't, a pedophile, and CG CSAM isn't going to change that. So the question is, does CG CSAM make it more likely that a pedophile will end up sexually abusing a child? My intuition says that it will not, in the same way that the proliferation of pornography has not made sexual assault of adults more common. (Some research indicates that the availability of pornography has decreased rates of sexual assault.) Child pornography is illegal--in part--because it cannot be produced without causing real harm to children. CG CSAM doesn't cause real harm to any person though; unless there's evidence that it increases the rates of child sexual abuse, I don't think that the squick factor is a reasonable basis for banning it. OTOH, adult pornography has generally led to a relaxation of sexual mores and norms--which I believe is a generally positive thing--and it's possible that CG CSAM would normalize child sexual abuse sufficiently that libertarians would be able to severely weaken age of consent and statutory rape laws. I don't really know, TBH; I'd want to see more research rather than reflexively banning it.

[–] cooopsspace@infosec.pub 23 points 9 months ago* (last edited 9 months ago)

At the end of the day, art is just pixels on a flat surface. Determining whether a depicted individual is under age where it's not obvious sets a dangerous precedent. Is the picture 17 or 18? Who knows.

But the problem is that people have been sexualising people like Emma Watson since she first appeared on screen. That's not okay, and rather than sending AI art underground, I think society needs to change to normalise education about sex, reproduction and genitalia, address the social issues, and treat pedophilia like the disease that it is.

Meanwhile, pedophiles' names are being written about publicly, risking mob violence and further isolation. Not to mention in the US there's a lot of negative attention being put on women's reproduction, children's sex ed and genitalia, and a push to make the whole lot illegal and taboo. Not to mention people teaching their kids pet names for their parts: "uncle Ben touched my heehaw" sounds a lot different to "uncle Ben touched my penis".

Society is a problem, the US particularly is going in the wrong direction on many aspects of sex education.

[–] Izzgo@kbin.social 22 points 9 months ago (2 children)

To me there is a clear difference between children and teens, say 16+. It is both morally wrong and unnatural to be attracted to prepubescent children; this is pedophilia. But basically, by definition, puberty makes people become sexually attractive, and it's natural for adults to be attracted. It's still morally wrong to act on those attractions unless you're in about the same stage of puberty or early adulthood. That's where we rely on a strong moral code and laws in society to protect youngsters who have recently gone through puberty. And hopefully, even after the laws no longer apply, we have enough societal pressure to strongly discourage wide age gaps between sexual partners.

Pedophilic disorder is characterized by recurring, intense sexually arousing fantasies, urges, or behavior involving children (usually 13 years old or younger).

[–] Astroturfed@lemmy.world 22 points 9 months ago

Don't try to understand it. You aren't going to get a good answer. It's a horrible mental illness level of sexual preference.

Anything can be sexualized with enough impulse and experiences. Everyone's got some weird dark fetish shit, and some of it's illegal in practice. Normal people bury that shit, or only discuss it in therapy, talking it out so they can hopefully never think about it again.

I'm sure there's different answers to this just like "why are there serial killers?". Just be glad it confuses you.

[–] Modern_medicine_isnt@lemmy.world 19 points 9 months ago (1 children)

My assumption on the attraction thing is that there are many things that cause attraction. Guys generally go for younger women. Well, what do younger women have? Tighter bodies, firmer breasts, more fitness, healthier-looking hair and skin, more defined hips... but as we all know, many guys specialize in being attracted to just a few of these things. Well, children usually have healthy skin and hair... so if a guy is attracted to just the attributes that don't require puberty, I can imagine that, attraction-wise, he might not feel such a difference. Now mix that in with wanting to feel superior and some of the other things like that, and kids start to fit well. Now add in a high libido and low self-control. Disaster. Take the same guy and add in an attraction to big boobs, and he is close to average, because kids don't have those.

Basically, it only takes a few missing screws. As for women: teenage boys have a lot of sexual energy and passion. I can imagine that being attractive. Plus there is of course the taboo of it, which appeals to some women just like any other kink. Put them in a space where they aren't getting their needs met by men, and give them access to boys. Disaster again. In the end, the diversity of humans means there will always be someone into anything you can imagine.

[–] Not_Alec_Baldwin@lemmy.world 9 points 9 months ago (4 children)

I think you're missing the point, at least as far as I understand it.

Child predators experienced some kind of trauma, and as a result they never developed. That could be external trauma (abuse) or internal trauma (thoughts, mental illness) but as their body "grew up" and they began developing sexual urges, they never matured.

Think about your first crush. They were your age, probably, unless you had a crush on Jennifer Connelly like every other millennial boy. As you grew up your crushes were probably always within a few years of you. It's just how it works.

In "minor attracted people" (I hate that term, but it works for criminals AND non-criminals so it's valid) the attraction just doesn't get updated.

Humans are REALLY BAD at controlling our impulses. Especially impulses that are taboo. Especially biological impulses (eating, sex, learning, games, etc).

So these people go into fields where they can be around kids. And then an "opportunity" arises and they either can't fight their criminal impulse or they rationalize their criminal behavior.

And boom. Traumatized kids and teachers in jail.

We need to get more people therapy.

[–] Delphia@lemmy.world 14 points 9 months ago (1 children)

In Germany, I think, they have a program where you can come forward and say "I'm attracted to minors and I want help before I get myself in trouble," and they get you help.

[–] Not_Alec_Baldwin@lemmy.world 7 points 9 months ago

I hadn't heard that but it seems like the right kind of direction to be headed.

[–] scarabic@lemmy.world 13 points 9 months ago

I imagine there’s some kind of vampiric quality to it. Kids are full of youth and innocence: things we are all constantly losing to time. Especially if someone’s own childhood was robbed from them, I think they will carry around a void they desperately want to fill but never can, because of course abusing a child doesn’t bring these things back to you. Still, many child abuse victims go on to abuse other children later in life, and this may be their drive: to seek the thing they lost. It’s beyond sad. Abusing children is straight up disgusting and terrible, but the convoluted desperation that causes people to do it is truly horrifying in a stranger-than-any-fiction kind of way.

[–] Apepollo11@lemmy.world 12 points 9 months ago* (last edited 9 months ago) (3 children)

I'm only going to tackle the tech side of this...

How difficult would it be for AI photo apps to filter out words, so someone cannot make anyone naked?

Easy. The most popular apps all filter for keywords, and I know that at least some then check the output against certain blacklisted criteria to make sure it hasn't let something slip through.

But...

Anyone can host their own version and disable these features, allowing them to generate whatever they want, in the exactly same way that anyone can write their own story containing whatever they want. All you need is the determination to do it, and some modicum of ability.

People have been been creating dodgy doctored photos long before computers. When Photoshop came out, it became easier, and with AI it's easier still. The current laws about creating and distributing indecent images still apply to these new images though.
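The prompt keyword filtering described above can be sketched roughly like this. This is a toy illustration, not any real app's actual filter; the blocklist contents and the normalization rules are made up for the example:

```python
import re

# Hypothetical blocklist for the sketch. Real services use far larger,
# curated term lists, and also run classifiers over the generated image
# itself rather than trusting the prompt check alone.
BLOCKED_TERMS = {"naked", "nude", "undressed"}

def is_prompt_allowed(prompt: str) -> bool:
    """Return False if any blocklisted term appears in the prompt.

    Lowercases the text and splits on non-letter characters, so trivial
    variants like "Naked!" are still caught. As the comment notes, this
    is easy to defeat with paraphrasing, which is why hosted services
    also check the output.
    """
    tokens = re.findall(r"[a-z]+", prompt.lower())
    return not any(token in BLOCKED_TERMS for token in tokens)
```

This also makes the comment's point concrete: the filter lives in the app, not the model, so a self-hosted copy can simply delete it.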

[–] ThisIsAManWhoKnowsHowToGling@lemmy.dbzer0.com 11 points 9 months ago (1 children)

There are two parts to this problem.

For kids who haven't hit puberty, there is a diagnosable pedophilia disorder. This is mostly genetics. (I'm pretty sure I've met an alpaca that was a pedophile once.) The molester's brain is wired wrong. Nothing to do about that. IMHO, they deserve pity as long as they keep their hands off the children.

For teenagers, the attraction is the power dynamic. Teens have a rather distorted view on what is attractive, and they tend to be naive and easily manipulated. On top of this, almost all teenagers have next to no impulse control, and many will make very very bad decisions (even knowing that the decision is bad) if doing so might result in some form of dopamine hit via sex/adrenaline rush/video games/peer approval/etc. Adults that seek out teenagers for sexual relationships are bad people who chose to be a groomer. There is no genetic component to being a groomer, and they don't deserve pity.

Btw, I can flesh out my claim about the alpaca if you want, but it will have to have a tw for adorable fluffy animals suffering a horrifically slow and painful death.

[–] Adalast@lemmy.world 5 points 9 months ago

Info link: https://pubmed.ncbi.nlm.nih.gov/18686026/

The DSM-5 specifies two types: pedophilic (victim age <11) and hebephilic (victim ages 11-14). What you are describing for the grooming is generally not pedophilia, because "children" older than 15 are generally considered post-pubescent and thus anatomically adults. Their frontal lobes still have a LOT of time needed to cook to completion, but they have the impulse-control issues for a reason, from an evolutionary standpoint. Yes, in modern society, "adults" who take advantage of the still-developing prefrontal cortex of a post-pubescent adolescent are shit human beings who don't deserve to be members of society, but they are technically not pedophiles, at least not clinically. Legally is a different story, but that is not a pertinent area of discussion right now.

Pedophilic and hebephilic individuals generally do not ever take their impulses into the realm of reality. Most of them actually end up feeling so much shame and remorse over even having the thoughts that they commit suicide. They definitely deserve pity and treatment, not stigmatization and ostracization.

As to the OP asking about AI art that depicts underage individuals in states of undress or sexual situations: ALL depictions of underage individuals in those contexts are illegal. By the letter of the law, if you draw stick figures on a piece of paper having sex, then label them as children, you have created child pornography. No depiction is legal, no matter the medium: AI-generated, hand-drawn, sculpted, watercolors, photos; under the law in (I believe) every state, they are all identical. Personally, I believe that this is asinine and 100% indicates that the purpose of these laws is to adjudicate morality, not "protect the children" as all of the people who push for them claim, but that is just my opinion. Hand-drawn artwork that has no photographic source material and does not depict real people has virtually 0 chance of having caused harm to any children, and AI just knows what the keywords mean in the context of reversing the vaporization of an image. The models weren't trained on kiddy porn; they were trained on pictures of children, and on pictures of adults doing their porny thing, so they are able to synthesize the two concepts together.

[–] Candelestine@lemmy.world 10 points 9 months ago (2 children)

I don't know many of the answers to these questions, I'm no social scientist or doctor. On the tech side of things, this is all very new and we are still coming to grips with it.

I feel pretty comfortable fielding this one though: There should be no exceptions granted for AI generated pornographic content, and a person's facsimile should have the same protections as actual photos of them. I do not think many people will find this controversial among the general public, as it would pretty clearly serve to protect all of us from having our identities used against our will.

I expect that even our congress should be able to get at least something on the books in this direction. Eventually. Maybe. Or at least the FCC or something.

[–] Delphia@lemmy.world 26 points 9 months ago (9 children)

Let me preface this by saying I DO NOT SUPPORT CSAM!

The only issue I take with AI-generated images is that there's no true "age" to the picture, and any legislation that would allow people to be jailed or charged would have to be worded very carefully.

Something like "depicting clearly underage subject matter if it were a person, or using prompts to generate someone who clearly appears underage"; simply because someone could be marked for life for typing in "naked elf", having the program spit out something with small boobs and childlike features, and not shredding their HD immediately.

[–] cedarmesa@lemmy.world 22 points 9 months ago* (last edited 9 months ago) (3 children)
[–] Delphia@lemmy.world 18 points 9 months ago (1 children)

That's exactly my point. Sure, the courts may rule in your favor eventually, but you just got marched out of work in handcuffs for possession of CSAM, your entire personal and professional circle knows, and any explanation you offer is going to sound like total bullshit.

"It was an AI generated image, and it was an elf! She just looked young, but not like illegal young! Guys you have to believe me!"

[–] Ganbat@lemmyonline.com 8 points 9 months ago* (last edited 9 months ago)

Should AI generated CSAM and CP be treated the same as a real person since it promotes the same issues?

That's where things get difficult. An episode of Law & Order: SVU tried to tackle this question a long time ago (but with Photoshopped fake CSAM) and the answer was a resounding "I dunno."

On the one hand, it's disgusting, deplorable, etc. On the other, a fake image means no one was victimized for it.

Does the content further radicalize these people, creating further risk of them victimizing a child, or does it sate their desires, helping to prevent them from victimizing a child? These questions are incredibly difficult to actually answer, and no answer can ever really be definitive, as you can't really predict how any one person might react.

[–] dameoutlaw@lemmy.ml 5 points 9 months ago (1 children)

AI CSAM should absolutely be treated as such. The model has been trained on images of real human children. I'm not sure where the issue comes from; I would imagine power. I'd need to check peer-reviewed work from those in the field, but I honestly can't stomach it.

[–] surewhynotlem@lemmy.world 19 points 9 months ago (15 children)

What about an artist just drawing it? Is that ok?

Or no, because the artist has seen children before?

[–] Maeve@kbin.social 4 points 9 months ago (2 children)

That this has two reduces is mind-boggling. OP, I've read some stuff that suggests some people abused as children grow up to be attracted to children. Others seem to be stuck at whatever mental age matches the general age they feel attracted to. And probably no small share do it just because they can.
