this post was submitted on 26 Jul 2023
607 points (96.6% liked)

Technology

Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual-property critique to target AI development.

[–] Faschr4023@lemmy.world 4 points 11 months ago (2 children)

Personally speaking, I've generated some stupid images, like different cities covered in baked beans, and had crude watermarks appear in them that were decipherable enough for me to find some of the source images used to train the AI. When it comes to photorealistic image generation, if all the AI does is mildly tweak the watermark, then it's not too hard to trace back.

[–] Harrison@ttrpg.network 7 points 11 months ago

All but a very few generative AI systems use completely destructive (lossy) methods to build their models. There is no way to recover the training images, outside of an infinitesimally small random chance.

What you are seeing is the AI recognising that images of the sort you are asking for generally include watermarks, and creating one of its own.
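A rough back-of-envelope calculation makes the "destructive" point concrete. The figures below are assumptions based on commonly cited approximate numbers for a Stable-Diffusion-class model (on the order of a billion float32 parameters, trained on roughly two billion image-text pairs); the exact values vary by model, but the conclusion doesn't:

```python
# Back-of-envelope: could a model's weights store its training images?
# Assumed round numbers (not exact for any specific model):
params = 1e9            # ~1 billion parameters
bytes_per_param = 4     # float32
training_images = 2e9   # ~2 billion training images

model_bytes = params * bytes_per_param            # ~4 GB of weights total
bytes_per_image = model_bytes / training_images   # weight "budget" per image

print(bytes_per_image)  # ~2 bytes per training image
```

Two bytes per image is orders of magnitude below even a tiny thumbnail, so the weights cannot contain the training set; they can only encode statistical regularities, watermarks being one of them.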

[–] Zeth0s@reddthat.com 3 points 11 months ago

Do you have examples? That should only happen in cases of overfitting, i.e. too many near-identical images of the same subject in the training data.
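As a hypothetical sketch of what overfitting means here (a toy polynomial fit, not an image model): when a model has about as many parameters as training points, it can reproduce the training data essentially exactly, which is the regime where regurgitation becomes possible:

```python
# Toy demonstration of overfitting-as-memorisation:
# 10 parameters fitted to 10 data points interpolates the training set.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)

coeffs = np.polyfit(x, y, deg=9)   # degree 9 -> 10 coefficients
y_hat = np.polyval(coeffs, x)

train_error = np.max(np.abs(y_hat - y))
print(train_error)  # effectively zero: the training data is "memorised"
```

Deduplicating the training set is the standard mitigation: a subject that appears once among billions of images stays in the "statistical regularity" regime, while thousands of identical copies push the model toward interpolating that specific image.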