This post was submitted on 21 Oct 2023
133 points (97.8% liked)

[–] ericisshort@lemmy.world 7 points 10 months ago (1 children)

But humans make works that are similar to other works all the time. I just hope we hold AI to the same standard for copyright violation that we hold humans to. There is a big difference between a derivative work and one that violates copyright.

[–] lemmyvore@feddit.nl 3 points 10 months ago (3 children)

Doesn't this argument assume that AIs are human? That's a pretty huge reach if you ask me. It's not even clear that LLMs are AI, never mind giving them human rights.

[–] ericisshort@lemmy.world 4 points 10 months ago (1 children)

No, I'm not assuming that. It's not about concluding that AIs are human; it's about having concrete standards on which to design laws. Setting a lower bar for copyright violation by LLMs than by humans would be like setting a lower speed limit for a self-driving car, and I don't think it makes any logical sense. To me that would be a disappointingly protectionist and Luddite perspective to apply to this new technology.

[–] lemmyvore@feddit.nl 0 points 10 months ago (1 children)

If LLMs are software, then they can't commit copyright violations; the onus for breaking the law falls on the people who use them. And until someone proves otherwise in a court of law, they are software.

[–] ericisshort@lemmy.world 3 points 10 months ago

No one is saying we should charge a piece of software with a crime. Corporations aren't human, but they can absolutely be charged with copyright violations, so being human isn't a requirement for this at all.

Depending on the situation, you would charge the user of the software (if they directed it to violate copyright), the company that makes the software (if they negligently released an LLM that has been proven to produce infringing output), or both.

[–] Saganastic@kbin.social 3 points 10 months ago (1 children)

Machine learning falls under the category of AI. I agree that works produced by LLMs should count as derivative works, as long as they're not too similar to any specific work they were trained on.

[–] nybble41@programming.dev 2 points 10 months ago

Not every work produced by an LLM should count as a derivative work, just the ones that embody unique, identifiable creative elements from specific works in the training set. We don't consider every work produced by a human to be a derivative work of everything they were trained on; work produced by (a human using) an AI should be no different.