“Many girls were completely terrified and had tremendous anxiety attacks because they were suffering this in silence,” she told Reuters at the time. “They felt bad and were afraid to tell and be blamed for it.”
WTF?!
Spain is a pretty Catholic country, and even if religious attendance is dropping off, the ingrained beliefs can still remain. The Madonna/whore dichotomy is still very prevalent in certain parts of society there.
Psychology 101.
Welcome to Christianity.
If a man sexually exploits a woman, it's the woman's fault for leading him astray.
This is how women are treated in deeply Christian communities.
Those women fear stepping forward to report assault or abuse because there are many in their community that will condemn them for it.
Welcome to religion at large
I read the headline and said oh come on. One paragraph in and that turned to what in the absolute fuck.
Are you surprised by teenage boys making fake nudes of girls in their school? I'm surprised by how few of these cases have made the news.
I don't think there's any way to put this cat back in the bag. We should probably work on teaching boys not to be horrible.
I'm not sure you can teach boys not to be horny teenagers 😜
Having been a teenage boy myself, I wouldn't dream of trying.
But I knew it wasn't OK to climb a tree with binoculars to try to catch a glimpse of the girl next door changing clothes, and I knew it wasn't OK to touch people without their consent. I knew people who did things like that were peeping toms and rapists. I believed peeping toms and rapists would be socially ostracized and legally punished more harshly than they often are in reality.
Making and sharing deepfakes of real people without their consent belongs on the same spectrum.
Being horny is one thing; sharing this stuff is another. If whoever made the fakes had kept them to themselves, nobody would have even known. The headline is still ass and typical "AI" hysteria, though.
We do eventually grow up at least
... into horny men
... but hopefully with a little more empathy and propriety.
There are always two paths to take - take away all of humanity’s tools or aggressively police people who abuse them. No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it, and for society to function properly we have to do something about the delinquent minority of society.
No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it,
Hydraulic press channel guy offended you somehow? I'm missing something here.
No, just an example. But if you’ve ever noticed the giant list of safety warnings on industrial machinery, you should know that every single one of those rules was written in blood.
Sometimes other bodily fluids.
The machines need to be oiled somehow.
🤨 vine boom
Either Darwin Awards or assholes, most likely. Those warnings are written out of fear of lawsuits.
I don't think they're offended. I think they're saying that a tool is a tool. A gun or AI are only dangerous if misused, like a hydraulic press.
We can't go around removing the tools because some people will abuse them. Any tool can kill someone.
We could also do a better job of teaching people from childhood not to be assholes.
Guns do not belong in the list. Guns are weapons, not tools. Don't bother posting some random edge case that accounts for approximately 0.000001% of use. This is a basic category error.
Governments should make rules banning and/or regulating weapons.
Weapons are tools, by strict definition, and there are legitimate uses for them. Besides, my point was that they should be regulated. In fact, because they are less generally useful than constructive tools, they should be regulated far MORE strictly.
It's like those X-ray apps that obviously didn't work but were marketed as a way to see women naked. Somehow that was very cool and nobody cared. Suddenly there's something that kinda works and everyone is shocked.
They are releasing stories like this to promote the new rules that require adults to log in to porn sites and to limit their use of them.
This is the best summary I could come up with:
A court in south-west Spain has sentenced 15 schoolchildren to a year’s probation for creating and spreading AI-generated images of their female peers in a case that prompted a debate on the harmful and abusive uses of deepfake technology.
Police began investigating the matter last year after parents in the Extremaduran town of Almendralejo reported that faked naked pictures of their daughters were being circulated on WhatsApp groups.
Each of the defendants was handed a year’s probation and ordered to attend classes on gender and equality awareness, and on the “responsible use of technology”.
Under Spanish law minors under 14 cannot be charged but their cases are sent to child protection services, which can force them to take part in rehabilitation courses.
In an interview with the Guardian five months ago, the mother of one of the victims recalled her shock and disbelief when her daughter showed her one of the images.
“Beyond this particular trial, these facts should make us reflect on the need to educate people about equality between men and women,” the association told the online newspaper ElDiario.es.
The original article contains 431 words, the summary contains 181 words. Saved 58%. I'm a bot and I'm open source!
What does this have to do with the equality of men and women? Girls are more at risk of this kind of abuse? That's a good point, but it's not brought up here. This parent is trying to politicize something that simply isn't political. Not that gender equality should be political in the first place.
Why not also go after the software companies for allowing such images to be generated in the first place, i.e. allowing AI-generated nude images to be produced from uploaded photos of real people?
How? How could you make an algorithm that correctly identified what nude bodies look like? Tumblr couldn't differentiate between nudes and sand dunes back when they enforced their new policies.
This sounds great, but it’s one of those things that is infinitely easier to say than do. You’re essentially asking for one of two things: Manual human intervention for every single image uploaded, or “the perfect image recognition system.” And honestly, the first is fraught with its own issues, and the second does not exist.
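To make that concrete, here is a minimal, entirely hypothetical sketch of what threshold-based image moderation boils down to. The nsfw_score function is a stand-in for a trained classifier that doesn't exist in "perfect" form, which is the whole problem: wherever you set the threshold, you get either false positives or missed abuse.

def nsfw_score(image_bytes: bytes) -> float:
    # Stand-in for a real trained model; returns a guessed probability
    # that the image is explicit. No perfect version of this exists.
    return 0.42  # placeholder value for illustration only

def moderate(image_bytes: bytes, threshold: float = 0.8) -> str:
    # Block the upload if the classifier is confident enough.
    score = nsfw_score(image_bytes)
    if score >= threshold:
        return "blocked"   # set the bar high and borderline abuse slips through
    return "allowed"       # set it low and the sand dunes get flagged too

print(moderate(b"fake image data"))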
Next they'll ban E2EE and libre software.