this post was submitted on 08 Dec 2023
395 points (93.2% liked)

Technology


‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[–] KairuByte@lemmy.dbzer0.com 3 points 7 months ago (1 children)

So if illegality doesn’t stop things from happening… how exactly are you stopping these apps from being made?

[–] azertyfun@sh.itjust.works 1 points 7 months ago (1 children)

Go after the people advertising those apps. Developers and advertising agencies who say, or intentionally imply, "create naked pictures of people you know" should all be prosecuted.

Unlike Photoshop or generic Stable Diffusion software, these apps have literally no legitimate reason to exist, since the ONLY thing they facilitate is creating non-consensual pornography. Seems like something that would be very easy to criminalize.

[–] KairuByte@lemmy.dbzer0.com 2 points 7 months ago

So wait, we can’t criminalize the use, but if we criminalize the advertisement, that fixes the situation?

You realize the exact same problem exists? There are plenty of tools with illegal uses, easily accessible online right now. Many on GitHub.