this post was submitted on 17 Aug 2024
614 points (98.4% liked)

Technology

[–] uriel238@lemmy.blahaj.zone 50 points 4 months ago* (last edited 4 months ago) (2 children)

If letting AI train on other people's works is unjust enrichment, then what the record labels did to creatives throughout the entire 20th century, taking ownership of their work through coercive contracting, is extra-unjust enrichment.

Not saying it isn't, but it's not new, and it's bothersome that we're only complaining loudly now.

[–] FiskFisk33@startrek.website 16 points 4 months ago (1 children)

don't misunderstand me now, i really don't want to defend record companies, but

legally they made deals and wrote contracts. It's not really the same thing.

[–] stellargmite@lemmy.world 5 points 4 months ago

Yep. And the streaming tech bros' collusion with the industry mobsters took it to another level. The people making the art are a mere annoyance to the jerks profiting from it. And yet the AI which they think saves them from this annoyance requires the art be created in the first place. I guess the history of recorded music holds a fair amount to plunder. But art, and even pop music, is an expression and reflection of individuals and the wider zeitgeist: actual humanity. I don't see what value is added when a person creates something semi-unique and a supercomputer burns massive amounts of energy to mimic it. At this stage all of supposed AI is a marketing gimmick to sell things. Corporations once again showing their hostility to humanity.

[–] Blackmist@feddit.uk 43 points 4 months ago (1 children)

It seems like it's only copyright infringement when poor people take rich people's stuff.

When it's the other way round, it's fair use.

[–] Pilferjinx@lemmy.world 8 points 4 months ago

It's like corporations and the super rich make the rules.

[–] Supermariofan67@programming.dev 36 points 4 months ago (6 children)

Copying is not theft. Letting only massive and notoriously opaque corporations control an emerging technology is.

[–] diamond_shield@reddthat.com 35 points 4 months ago (9 children)

I don't think it would be that difficult to make "ethical" AI.

Simply cite the sources you used and put everything, from the training data to the models and the weights, in the public domain.

It baffles me as to why they don't; wouldn't it be much simpler?

[–] snooggums@midwest.social 22 points 4 months ago

It would cost more in time and effort to do it right.

[–] Buffalox@lemmy.world 15 points 4 months ago* (last edited 4 months ago) (1 children)

Simply refer to the sources you used

Source: The Internet.

Most things are duplicated thousands of times on the Internet, so a full list of sources would quickly grow longer than almost any answer an AI gives.

But even disregarding that, as an example: stating on a general, publicly available page documenting the AI that you scraped Republican and Democratic home sites does not explain which, if any, was used to answer a particular political question.

Your proposal sounds simple, but is probably extremely hard to implement in a useful way.

[–] gratux@lemmy.blahaj.zone 8 points 4 months ago

Fundamentally, an LLM doesn't "use" individual sources for any given answer. It is just a function approximator, and as such every data point influences the result, more so if it closely aligns with the input.
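That point can be sketched with a toy example (a deliberately tiny stand-in, nothing like a real LLM): even a one-parameter least-squares fit has no per-answer "sources", yet every training point moves the learned parameter.

```python
# Toy "function approximator": fit y = w*x by least squares.
# Minimizing sum((y - w*x)^2) gives w = sum(x*y) / sum(x*x),
# so every training point contributes to the single learned weight.
def fit_slope(points):
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x, y in points)
    return sxy / sxx

data = [(1, 2.0), (2, 4.1), (3, 5.9)]
w_full = fit_slope(data)
w_without_last = fit_slope(data[:-1])

# Dropping one point changes the model's behavior on *every* input,
# not just on answers "about" that point.
print(w_full, w_without_last)
```

There is no way to point at the fitted `w` and say which training point an individual prediction "came from"; the same holds, at vastly larger scale, for the weights of an LLM.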

[–] wewbull@feddit.uk 5 points 4 months ago (1 children)

They don't do it because they claim that there isn't enough public domain data.... But let's be honest, nobody has tried because nobody wants a machine that isn't able to reference anything in the last 100 years.

[–] Even_Adder@lemmy.dbzer0.com 4 points 4 months ago (2 children)

You should read this letter by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries.

Why are scholars and librarians so invested in protecting the precedent that training AI LLMs on copyright-protected works is a transformative fair use? Rachael G. Samberg, Timothy Vollmer, and Samantha Teremi (of UC Berkeley Library) recently wrote that maintaining the continued treatment of training AI models as fair use is “essential to protecting research,” including non-generative, nonprofit educational research methodologies like text and data mining (TDM). If fair use rights were overridden and licenses restricted researchers to training AI on public domain works, scholars would be limited in the scope of inquiries that can be made using AI tools. Works in the public domain are not representative of the full scope of culture, and training AI on public domain works would omit studies of contemporary history, culture, and society from the scholarly record, as Authors Alliance and LCA described in a recent petition to the US Copyright Office. Hampering researchers’ ability to interrogate modern in-copyright materials through a licensing regime would mean that research is less relevant and useful to the concerns of the day.

[–] Vince@lemmy.world 32 points 4 months ago (24 children)

Ok, dumb question time. I'm assuming no one has any significant issues, legal or otherwise, with a person studying all Van Gogh paintings, learning how to reproduce them, and using that knowledge to create new, derivative works and even selling them.

But when this is done with software, it seems wrong. I can't quite articulate why though. Is it because it takes much less effort? Anyone can press a button and do something that would presumably take the person from the example above years or decades to do? What if the person was somehow super talented and could do it in a week or a day?

[–] tyler@programming.dev 38 points 4 months ago
  1. Because it’s not human. We distinguish ourselves in everything, that’s why we think we’re special. The same applies to inventions, e.g. why monkeys can’t have a patent.
  2. Time. New "products," whether art, engineering, or science, all take time for humans. So value is created with time, because it creates scarcity and demand.
  3. Talent. Due to the time factor, talent and practice are desired traits of a human. You mention that a talented human can do something in just a few days that might take someone else years, but it might only take them a few days because they spent years learning.
  4. Perfection. Striving for perfection is a human experience. A robot doing something perfect isn’t impressive, a human doing something perfect is amazing. Even the most amateur creator can strive for perfection.

Think about paintings vs prints. Paintings are much more valuable because they aren’t created as quickly as the prints are. Even the most amateur artwork is more valuable as a physical creation rather than a copy, like a child’s crayon drawing.

This even applies to digital art, because the first instance of something is the most difficult thing to create; everything after that is just a copy. Yes, this does apply to some current gen AI tech, but very soon that will no longer be the case.

This change from humans asking for something and having other humans create it to humans asking for something and having computers create it is a loss of our humanity, what makes us human.

[–] kibiz0r@midwest.social 20 points 4 months ago* (last edited 4 months ago)

If you're looking for a universally-applicable moral framework, join the thousands of years of philosophers striving for the same.

If you're just looking for an explanation that allows you to put one foot in front of the other...

Laws exist for us to spell out the kind of society we'd like to live in. Generally, we prefer that individuals be able to participate in cultural conversations and offer their own viewpoint. And generally, we prefer that groups of people don't accumulate massive amounts of power over other groups of people.

Dedicating your life to copying another artist's style is participating in a cultural conversation, and you won't be able to help yourself from infusing your own lived experience into your work of copying the artist. If only by the details that you focus on getting exactly right, the slight mistakes that repeat themselves or morph over the course of your career, the pieces you prioritize replicating over and over again. It says something about who you are, and that's worth appreciating.

Now, if you're trying to pass those off as originals and not your own tributes, then you're deceiving people and that's a problem because you're damaging the cultural conversation by lying about the elements you're putting into it. Even so, sometimes that's an interesting artistic enterprise in itself. Such as when artists pretend to be someone else. Warhol was a fan of this. His whole career revolved around messing with concepts of authenticity in art.

As for power, you don't gain that much leverage over another artist by simply copying their work. And if you riff on it to upstage them, you're just inviting them to do the same to you in turn.

But if you can do that mechanically and quickly, so that for any creative twist they put out to undermine your attempts to upstage them you have an instant response at little cost to yourself, now you're in a position of great power. The more the original artist produces, the stronger your advantage over them becomes. The more they try, the harder it is for them to win.

We don't generally like when someone has accumulated tons of power, especially when they subsequently use that power to prevent others from being able to compete.

Edit: I'd also caution against trying to make an objective test for whether a particular act of copying is "okay". This invites two things:

  1. Artists can't help but question what's acceptable and play around with it. They will deliberately transgress in order to make a point, and you'll be forced to admit that your objective test is worthless.

  2. Tech companies are relentlessly horny for this kind of objective legal framework, because they want to be able to algorithmically approach the line and fill its border to fractal levels of granularity without technically crossing the line. RealPage, DoorDash, Uber, Amazon, OpenAI all want "illegal" to be as precisely and quantitatively defined as possible, so that they can optimize for "barely legal".

[–] aStonedSanta@lemm.ee 11 points 4 months ago (4 children)

They are copying your intellectual property and digitizing its knowledge. It's a bit different because it's PERMANENT. With humans, knowledge can be lost, forgotten, or ignored. In these LLMs that's not an option. The skill factor is also a big issue, imo; it's very easy to set up an LLM to make AI imagery nowadays.

[–] Eccitaze@yiffit.net 11 points 4 months ago

I actually had some thoughts about this and posted this in a similar thread:

First, that artist will only learn from a handful of artists, instead of from every artist's entire body of work all at the same time. They will also eventually develop their own unique style and voice--the art they make will reflect their own views in some fashion, instead of being a poor facsimile of someone else's work.

Second, mimicking the style of other artists is a generally poor way of learning how to draw. Just leaping straight into mimicry doesn't really teach you any of the fundamentals like perspective, color theory, shading, anatomy, etc. Mimicking an artist that draws lots of side profiles of animals in neutral lighting might teach you how to draw a side profile of a rabbit, but you'll be fucked the instant you try to draw that same rabbit from the front, or if you want to draw a rabbit at sunset. There's a reason why artists do so many drawings of random shit like cones casting a shadow, or a mannequin doll doing a ballet pose, and it ain't because they find the subject interesting.

Third, an artist spends anywhere from dozens to hundreds of hours practicing. Even if someone sets out expressly to mimic someone else's style, teaches themselves the fundamentals, it's still months and years of hard work and practice, and a constant cycle of self-improvement, critique, and study. This applies to every artist, regardless of how naturally talented or gifted they are.

Fourth, there's a sort of natural bottleneck in how much art that artist can produce. The quality of a given piece of art scales roughly linearly with the time the artist spends on it, and even artists that specialize in speed painting can only produce maybe a dozen pieces of art a day, and that kind of pace is simply not sustainable for any length of time. So even in the least charitable scenario, where a hypothetical person explicitly sets out to mimic a popular artist's style in order to leech off their success, it's extremely difficult for the mimic to produce enough output to truly threaten their victim's livelihood. In comparison, an AI can churn out dozens or hundreds of images in a day, easily drowning out the artist's output.

And one last, very important point: artists who trace other people's artwork and upload the traced art as their own are almost universally reviled in the art community. Getting caught tracing art is an almost guaranteed way to get yourself blacklisted from every art community and banned from every major art website I know of, especially if you're claiming it's your own original work. The only way it's even mildly acceptable is if the tracer explicitly says "this is traced artwork for practice, here's a link to the original piece, the artist gave full permission for me to post this." Every other creative community, like writing and music, takes a similarly dim view of plagiarism, though it's much harder to prove outright than with art. Given this, why should the art community treat someone differently just because they laundered their plagiarism with some vector multiplication?

[–] Cornelius_Wangenheim@lemmy.world 9 points 4 months ago* (last edited 4 months ago) (1 children)

Artists who rip off other great works are still developing their talent and skills, which they can then go on to use to make original works. The machine will never produce anything original. It is only capable of mixing together things it has seen in its training set.

There is a very real danger of AI eviscerating artists' ability to make a living, leaving very few people with the financial means to practice their craft day in and day out, resulting in a dearth of good original art.

[–] Dkarma@lemmy.world 5 points 4 months ago

The machine will never produce anything original. It is only capable of mixing together things it has seen in its training set.

This is patently false and shows you don't know a single thing about how ai works.

[–] MinFapper@startrek.website 9 points 4 months ago* (last edited 4 months ago)

So, before the invention of the camera, the most valuable and most popular creative skill was replicating people on canvas as realistically as possible. Yes, we remember famous exceptions like Picasso, but by sheer number of paintings the most common were portraits of rich people.

After the cameras took that job away, prevailing art changed to become more abstract and "creative". But that still pissed off a lot of people that had spent a very long time honing a skill that was now no longer in demand.

What we're seeing is a similar shift. I think future generations of artists will value color theory, composition, etc. over specific brush stroke techniques. AI will make art much more accessible once enough time has passed for AI assisted art to be considered art. Make no mistake: it will always be people that actually create the art - AI will just reduce/remove the grunt work so they can focus more on creativity.

Now, whether billion-dollar corporations deserve to exploit the labor of millions of people is a whole separate conversation, but tl;dr: they don't, but they're going to anyway because there is little to stop them under current economic/governance models.

[–] FooBarrington@lemmy.world 6 points 4 months ago (2 children)

There's a simple argument: when a human studies Van Gogh and develops their own style based on it, it's only a single person with very limited output (they can only paint so much in a single day).

With AI you can train a model on Van Gogh and similar paintings, and infinitely replicate this knowledge. The output is almost unlimited.

This means that the skills of every single human artist are suddenly worth less, and the possessions of the rich are suddenly worth more. Wealth concentration is poison for a society, especially when we are still reliant on jobs for survival.

AI is problematic as long as it shifts power and wealth away from workers.

[–] taaz@biglemmowski.win 4 points 4 months ago (1 children)

I am guessing the closest opposite argument would be how close it is to outright copying the original work?

[–] JiveTurkey@lemmy.world 31 points 4 months ago (1 children)

Good for this guy. Fuck AI and the companies responsible.

[–] xenoclast@lemmy.world 14 points 4 months ago (2 children)

Capitalism is the problem. Greed is the reason. I like that shitty idiots are fighting other shitty idiots, because I think it's funny, but neither party is the good guy.

[–] mm_maybe@sh.itjust.works 4 points 4 months ago (1 children)

Capitalism is precisely the problem, because if the end product were never sold nor used in any commercial capacity, the case for "fair use" would be almost impossible to challenge. They're betting on judges siding with them in extending a very specific interpretation of fair use that has been successfully applied to digital copying of content for archival and distribution as in e.g. Google Books or the Internet Archive, which is also not air-tight, just precedent.

Even fair uses of media may not respect the dignity of the creators of works used to create "media synthesizers". In other words, even if a computer science grad student does a bunch of scraping for their machine learning dissertation, unless they ask and get permission from the creators, their research isn't upholding the principle of data dignity, which current law doesn't address at all, but is obviously the real issue upsetting people about "Generative AI".

[–] werefreeatlast@lemmy.world 22 points 4 months ago

Have you or a friend used YouTube or reddit in the past 10 years? Then you're entitled to compensation for the training of AI.

[–] JustZ@lemmy.world 13 points 4 months ago* (last edited 4 months ago) (4 children)

No, AI does not create new ~~derivative~~ transformative works. Copyright law is very clear that the thing that is copyrightable is that modicum of creativity, reduced to a tangible medium of expression, that society must encourage and protect.

Derivative works need even more creativity to be protectable than original works because it has to be so newly creative as to be a different work, transformative, even though the original may still be very recognizable.

An AI system does not have creativity. At best, it could mimic someone who is creative, but it could never have creativity on its own. It is generative, not creative.

It's like that monkey that took a nice picture, but the picture was not copyrightable because the person seeking to enforce the copyright didn't create the work. It's creativity that the Constitution seeks to encourage by the copyright clause.

[–] doodledup@lemmy.world 10 points 4 months ago (1 children)

You can make new derivative work without being creative. Just look at all the YouTubers copying each other.

[–] Knock_Knock_Lemmy_In@lemmy.world 4 points 4 months ago (1 children)

it has to be so newly creative as to be a different work, even though the original may still be recognizable

Your definition implies Andy Warhol wasn't creative.

[–] ArmokGoB@lemmy.dbzer0.com 4 points 4 months ago

The AI doesn't need creativity because the "A" in "AI" stands for "artificial," not "autonomous." It's a tool. Someone is controlling the output by setting the input parameters.

[–] phoenixz@lemmy.ca 9 points 4 months ago (1 children)

As much as I hate google, they're not wrong

[–] doodledup@lemmy.world 6 points 4 months ago

Google has long been doing the same thing.
