submitted 8 months ago by admin@beehaw.org to c/technology@beehaw.org
[-] scrubbles@poptalk.scrubbles.tech 63 points 8 months ago

AI could probably fly under the radar if they just didn't do stupid stuff like this, but they just have to push the boundaries. If they made any number of fake voices it'd be fine, but no, had to do a celebrity. I hope they lose. Stupid stupid stupid marketing department.

[-] FlashMobOfOne@beehaw.org 60 points 8 months ago

I think it's inevitable.

The bad actors stealing data to train their apps don't seem to have an adequate understanding of the implications of their actions. They're just looking to make a quick buck and run.

Bring on the lawsuits.

[-] scrubbles@poptalk.scrubbles.tech 27 points 8 months ago

For sure, they've been just scraping everything for so long without a care. It's about time they start facing it.

[-] agressivelyPassive@feddit.de 9 points 8 months ago

Actually, I think this is a legally very interesting area.

At the end of the day, AIs are just fancy imitations. Nobody would sue someone for imitating a voice, as long as it's not impersonation (in the legal sense).

[-] CosmoNova@feddit.de 16 points 8 months ago

I think you misunderstand something, the same thing many AI enthusiasts and critics often choose not to understand. Generative AIs aren't just born from plain code, and they don't just imitate. They use a ton of data as reference points. It's literally in the name of the technology.

You could claim "well, maybe they used different voices and mixed them together," but that is highly unlikely, given how much of a wild-west approach most generative AI services take. It's more likely they used protected property here in a way it was not intended to be used, in which case SJ does indeed have a legal case.

[-] bioemerl@kbin.social 6 points 8 months ago

They use a ton of data as reference points. It’s literally in the name of the technology.

Reference is the wrong word.

They learn the patterns that exist in data and are able to predict future patterns.

They don't actually reference the source material during generation (barring overfitting, which can happen and is roughly akin to a human memorizing something and reproducing it).
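To make the distinction concrete, here's a toy sketch (it assumes nothing about any real generative model): fit a line to a thousand noisy points. The fitted "model" is just two floats, the learned pattern; the training points themselves are not stored in it anywhere.

```python
# Toy illustration, not any real model: learn y = 2x + 1 from 1000 noisy
# samples. The entire fitted "model" is two numbers (slope, intercept),
# so generation afterwards never touches the training data.
import random

random.seed(0)
data = [(x, 2 * x + 1 + random.gauss(0, 0.1)) for x in range(1000)]

# Ordinary least squares by hand.
n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in data)
         / sum((x - mean_x) ** 2 for x, _ in data))
intercept = mean_y - slope * mean_x

model = (slope, intercept)  # the whole learned artifact: 2 floats
print(model)  # roughly (2.0, 1.0)
```

The 1000 samples are gone once training ends; only the pattern survives, which is the point being made about referencing.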

[-] sonori@beehaw.org 12 points 8 months ago

Whether or not the copyrighted data shows up in the final model is utterly irrelevant, though. It is illegal to use copyrighted material period outside of fair use, and this is most certainly not. This is civil law, not criminal, so the standard is "more likely than not" rather than "beyond a reasonable doubt." If a company cannot provide reasonable evidence that they created the model entirely with material they own the rights to use for that purpose, then it is a violation of the law.

Math isn’t a person, doesn’t learn in anything approaching the same method beyond some unrelated terminology, and has none of the legal rights that we afford to people. If it did, then this would be by definition a kidnapping and child abuse case, not a copyright case.

[-] bioemerl@kbin.social 4 points 8 months ago* (last edited 8 months ago)

It is illegal to use copyrighted material period outside of fair use, and this is most certainly not.

Yeah it is. Even assuming fair use applied, fair use is largely a question of how much a work is transformed and (a billion images) -> AI model is just about the most transformative use case out there.

And this assumes any of this matters when they're literally not copying the original work (barring overfitting). It's a public internet download. The "copy" is made by Facebook or whoever you uploaded the image to.

The model doesn't contain the original artwork or parts of it. Stable Diffusion literally has one byte per image of training data.
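Back-of-the-envelope on that claim, using commonly cited ballpark figures (a ~2 GB Stable Diffusion 1.x checkpoint, ~2.3 billion LAION training images; both are approximations I'm assuming, not official numbers):

```python
# Rough arithmetic only; checkpoint size and image count are approximate.
checkpoint_bytes = 2 * 1024**3     # ~2 GB Stable Diffusion 1.x weights
training_images = 2_300_000_000    # ~2.3B images in LAION-2B-en

bytes_per_image = checkpoint_bytes / training_images
print(round(bytes_per_image, 2))   # ~0.93 -- under one byte per image
```

Even a heavily compressed thumbnail needs thousands of bytes, so under one byte per image rules out the model being a straightforward archive of its training set.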

[-] admiralteal@kbin.social 6 points 8 months ago* (last edited 8 months ago)

I never understood why so many from the more techbro political alignment find this argument so convincing.

It doesn't really matter whether the original data is present in the model or if it was reduced to such an abstract form that we cannot find it anymore. The model only can exist because of the original data being used to make it, and it was used without proper license. It doesn't matter how effective nor how lossy your compression is, mere compression is not transformation and does not wash away copyright.

The argument that it is in some way transformative is more relevant. But it's also got a pretty heavy snort of "thinking like a cop" in it, fundamentally. Yes, the law protects transformative works, so if we only care what the written rules of the law says, then if we can demonstrate that what the AI does is transformative, the copyright issues go away. This isn't a slam dunk argument that there's nothing wrong with what an AI does even if we grant it is transformative. It may also simply be proving that the copyright law we have fails to protect artists in the new era of AI.

In a truly ideal world, we wouldn't have copyright. At all. All these things would be available and offered freely to everyone. All works would be public domain. And artists who contributed to the useful arts and sciences would be well-fed, happy, and thriving. But we don't live in that ideal world, so instead we have copyright law. The alternative is that artists cannot earn a living on their works.

[-] bioemerl@kbin.social 2 points 8 months ago

It doesn't really matter whether the original data is present in the model

Yeah it does. One of the arguments people make is that AI models are just a form of compression, and as a result distributing the model is akin to distributing all the component parts. This fact invalidates that argument.

This isn't a slam dunk argument that there's nothing wrong with what an AI does even if we grant it is transformative. It may also simply be proving that the copyright law we have fails to protect artists in the new era of AI.

If we change the law to make it illegal it's illegal.

[-] brie@beehaw.org 5 points 8 months ago

The number of bytes per image doesn't necessarily mean there's no copying of the original data. There are examples of some images being "compressed" (lossily) by Stable Diffusion; in that case the images were specifically sought out, but I think it does show that overfitting is an issue, even if the model is small enough to ensure it doesn't overfit for every image.

[-] bioemerl@kbin.social 1 points 8 months ago

Overfitting is an issue for the images that were overfit. But note in that article that those images mostly appeared many times in the data set.
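A toy illustration of why duplication drives memorization (a made-up bigram "model," not how diffusion models actually work): a phrase that repeats many times in training dominates the next-word counts, so greedy generation reproduces it verbatim.

```python
from collections import Counter, defaultdict

# Tiny next-word predictor trained by counting which word follows which.
corpus = ("colorless green ideas sleep furiously . " * 50  # duplicated 50x
          + "a dog ran in a park . ").split()              # seen once

follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def generate(word, n):
    out = [word]
    for _ in range(n):
        word = follows[word].most_common(1)[0][0]  # always pick the likeliest
        out.append(word)
    return " ".join(out)

# The heavily duplicated phrase comes back word-for-word:
print(generate("colorless", 5))  # colorless green ideas sleep furiously .
```

The duplicated sample is effectively memorized because nothing else competes with its counts, which mirrors the finding that the extractable Stable Diffusion images were mostly heavily duplicated in the training set.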

People who own the rights to one of those images have a valid argument. Everyone else doesn't.

[-] Overzeetop@beehaw.org 2 points 8 months ago* (last edited 8 months ago)

It is illegal to use copyrighted material

It's illegal to reproduce copyrighted material*. That includes changing the format, as well as things which fall under "derivative" works, but not creating a new work in the style of someone else's (unless it falls under the derivative definition). Many voice impersonators exist, and the way you impersonate a voice is to listen to (usually) recordings of that person and practice producing the same sounds they use for common phonemes (as well as the vocal tract shape and larynx positioning that alter pitch production and the overtones which represent vowel shapes). ML does, effectively, the same thing without requiring a human to do the listening and practicing.

That said, I think this type of use should be strictly prohibited. In fact, I think it should carry severe criminal penalties for any specific voice, not just celebrities. Having the ability to simulate accurate, regional-sounding voices is extremely valuable in the general sense, but imitating or mimicking a specific person's voice without their explicit consent and/or direction has very few, if any, legitimate uses.

* I didn't think that voice mimicking would count as valid for any law, but Google tells me of the "Right of Publicity" and there is (again according to Google) case law involving Ford and Bette Midler. So while it's not a copyright violation to reproduce a voice, it may still run afoul of some laws.

[-] agressivelyPassive@feddit.de 3 points 8 months ago

Again: how is that different from an imitation? What exactly differentiates a human watching a movie to imitate a voice from a machine doing the same thing?

And that is what you misunderstand. AI is not magic, it's computation. Nothing more. In no other context would it even matter whether the source data was intended for the use case, if no infringement is being committed by the end product.

[-] scrubbles@poptalk.scrubbles.tech 5 points 8 months ago

It's a hard one. You train a general AI and ask for a story idea, that's not a huge deal IMO. You ask it to write in the style of George R.R. Martin, though, and that's something different. Yes, you can do it by hand too, but these tools make it easier than ever.

Then the sub-questions... Is it okay to do it for free? What if you distribute it? What if you charge for it? All questions that these AI companies are just ignoring when they potentially have massive ramifications.

Making a random avatar is fine. Using ScarJo is iffy if you're using it for free. What if you're streaming on Twitch with her? What if you're charging to use her likeness on Twitch, where the users will make money? Idk the answers to any of those.

[-] agressivelyPassive@feddit.de 3 points 8 months ago

But why would anyone be committing anything fraudulent with that? Where exactly does it become "too much" AI?

I find it very iffy to argue that writing in the style of someone else is illegal. That's a perfectly normal thing to happen. Maybe AI makes it easier, but if an action is not illegal, why would doing the same thing tool-assisted be illegal? Doesn't make sense.

[-] lol3droflxp@kbin.social 4 points 8 months ago

I think that writing in someone else’s style to an extent that it becomes very obvious is indeed something that raises copyright concerns.

[-] agressivelyPassive@feddit.de 5 points 8 months ago

No, how would it? You can't copyright a style.

[-] lol3droflxp@kbin.social 2 points 8 months ago

Well, there have been several music lawsuits about certain songs and their degree of similarity to others. If you were to write so closely to another author that you are imitating something like trademark mannerisms, there may be a case for that.

[-] agressivelyPassive@feddit.de 1 points 8 months ago

Identity and style are two completely different things, though.

In literature, there are no trademark mannerisms.

[-] ThunderingJerboa@kbin.social 7 points 8 months ago* (last edited 8 months ago)

I mean, it depends on where they are from. If they are from the US or Europe they would be fucking idiots, but if they are Chinese, Russian, etc. they are basically untouchable and it will merely be a game of whack-a-mole.

Edit: welp, did a WHOIS on their website and it seems it's from Arizona. So yeah, never mind my top comment; if this is truly a company stationed in Arizona, they really fucked up.

[-] remotelove@lemmy.ca 9 points 8 months ago

The website is hosted out of Arizona, but that is about it.

You can get the full company name from their privacy policy here: https://lisaai.app/privacy.html

If you Google the name, the company is registered in Istanbul: https://www.firmabulucu.com/isletme/convert-yazilim-limited-sirketi/

[-] Endorkend@kbin.social 36 points 8 months ago

This is getting out of hand rather quickly.

I recently was watching some feelgood videos to up my mood (stuff like Thedodo) and one of the channels I landed on, the voice instantly sounded extremely familiar.

I thought "oh, did The Girl with the Dogs start another channel?" but then I listened more carefully and noticed the typical "generated" fragments in the audio.

They aren't just copying the voices of celebrities, but also of popular YouTubers.

[-] goldenbug@kbin.social 8 points 8 months ago

Wtffff

I use her videos to focus on my tasks. I would rather she profit from it.

[-] Endorkend@kbin.social 3 points 8 months ago

I watch them to spectate the inevitable moment she grooms a bona fide ice bear.

[-] goldenbug@kbin.social 1 points 8 months ago

'Shhhh, good boy,' with a bolt cutter in her hands, trying to get 2mm of nail off its paw.

[-] voidf1sh@lemm.ee 14 points 8 months ago

We had 'Snoop Dogg' call us at work last week and leave a voicemail talking about some class action lawsuit. Wack

[-] shiveyarbles@beehaw.org 9 points 8 months ago

Wiggidy wack, or the regular kind?

[-] sculd@beehaw.org 12 points 8 months ago

Please destroy these unethical apps! They can be used for so many bad applications: scams, misinformation, identity theft....

[-] elouboub@kbin.social 12 points 8 months ago

Would be hilarious if the picture there was also AI-generated (it sure does look like it to me).

[-] perishthethought@lemm.ee 1 points 8 months ago

I was gonna say - I thought this was about Scarlett. Who's that in the picture?

[-] Linnce@beehaw.org 1 points 8 months ago

At first glance I thought it was Millie Bobby Brown

[-] Send_me_nude_girls@feddit.de 9 points 8 months ago

So what tools did they use to create fake AI voice of her? Asking for a friend.

[-] bioemerl@kbin.social 6 points 8 months ago* (last edited 8 months ago)
[-] Send_me_nude_girls@feddit.de 1 points 8 months ago* (last edited 8 months ago)

Thanks, I'll check it out. I tried TorToiSe before but couldn't get it to run.

[-] ursakhiin@beehaw.org 3 points 8 months ago

Well this isn't creepy u/send_me_nude_girls

[-] autotldr@lemmings.world 8 points 8 months ago

🤖 I'm a bot that provides automatic summaries for articles:

Scarlett Johansson is taking legal action against an AI app developer for using her name and likeness in an online ad, according to a report from Variety.

As reported by Variety, the 22-second ad showed Johansson behind the scenes while filming Black Widow, where she actually says “What’s up guys?

It’s Scarlett and I want you to come with me.” But then, the ad transitions away from Johansson, while an AI-generated voice meant to sound like the actress states: “It’s not limited to avatars only.

At the very bottom of the ad, Variety reports that Convert Software — the developer behind the app — included text that reads: “Images produced by Lisa AI.

It has nothing to do with this person.” Representatives for Johansson tell Variety that the actress was never a spokesperson for the app and that her attorney, Kevin Yorn, “handled the situation in a legal capacity.”

Neither Yorn nor Convert Software responded to The Verge’s request for comment about the nature of the legal action.


Saved 46% of original text.

[-] purpledonkey@beehaw.org 5 points 8 months ago
this post was submitted on 02 Nov 2023
218 points (100.0% liked)
