this post was submitted on 31 Jul 2023
397 points (100.0% liked)
Technology
They don't work. It's total bunk.
Exactly. See above. No one can (confidently) tell which is which. There's just educated guessing.
K, so you're ignoring the entire point of my post, that the onus is on the creator to prove they hold the copyright, and just pointing out that it's hard to figure out which content to steal?
The creator has a copyright if the relevant authorities have granted the copyright registration to them; that is all they need to prove.
Copyright isn't registered anymore; it's granted on creation in almost all jurisdictions that matter. There's no documentation beyond the published work itself.
I'll go one further - they can never work. This kind of AI is trained adversarially: an artist network generates art, and a gatekeeper network rates its confidence that the art looks human-made. The artist network keeps training until it can consistently fool the gatekeeper. If a system existed that could reliably identify currently generated AI art, it would just become the new gatekeeper, and the artist network would only get better.
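That artist/gatekeeper loop is essentially how a GAN (generative adversarial network) is trained. Here's a minimal toy sketch in 1-D to show the mechanics; everything here (the target distribution, the linear generator, the logistic gatekeeper, the learning rate) is an illustrative choice, not any real art model or detector:

```python
import numpy as np

# Toy 1-D GAN. "Real" data is N(4, 0.5); the generator G(z) = a*z + b
# tries to mimic it, while the gatekeeper D(x) = sigmoid(w*x + c)
# tries to tell real samples from generated ones. Purely illustrative.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0    # generator parameters
w, c = 0.1, 0.0    # gatekeeper (discriminator) parameters
lr = 0.05
batch = 64

for step in range(3000):
    real = rng.normal(4.0, 0.5, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Gatekeeper step: ascend mean[log D(real)] + mean[log(1 - D(fake))]
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    grad_w = np.mean((1 - d_real) * real) - np.mean(d_fake * fake)
    grad_c = np.mean(1 - d_real) - np.mean(d_fake)
    w += lr * grad_w
    c += lr * grad_c

    # Artist step: ascend mean[log D(fake)] (non-saturating generator loss),
    # i.e. adjust a, b so the gatekeeper rates fakes as more human.
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    grad_a = np.mean((1 - d_fake) * w * z)
    grad_b = np.mean((1 - d_fake) * w)
    a += lr * grad_a
    b += lr * grad_b

print(f"generated mean ~ {b:.2f} (real mean 4.00)")
```

The point of the sketch is the arms race: any improvement to the gatekeeper is immediately a better training signal for the artist, which is why a published "AI detector" just becomes the next thing generators are trained to beat.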