submitted on 16 Jan 2024 by Gaywallet@beehaw.org to c/technology@beehaw.org · 88 points
webghost0101@sopuli.xyz 8 points 5 months ago

They did literally nothing and seem to be using the default Stable Diffusion model, which is supposed to be a tech demo. It would have been easy to put "(((nude, nudity, naked, sexual, violence, gore)))" as the negative prompt.
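
For reference, a minimal sketch of what supplying a negative prompt looks like with Hugging Face's diffusers library. The checkpoint name and prompt strings are illustrative placeholders, and the triple-parenthesis emphasis syntax quoted above is specific to front-ends like the AUTOMATIC1111 web UI; in diffusers the negative prompt is just a plain string:

```python
# Minimal sketch: passing a negative prompt through the diffusers API.
# The checkpoint ID and prompt text below are placeholders, not from the thread.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # assumed default SD 1.5 checkpoint
    torch_dtype=torch.float16,
).to("cuda")

result = pipe(
    prompt="portrait photo of a person at a tech conference",  # example prompt
    negative_prompt="nude, nudity, naked, sexual, violence, gore",
)
result.images[0].save("output.png")
```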

megopie@beehaw.org 7 points 5 months ago

The problem is that negative prompts can only help so much; when the training data is so heavily poisoned in one direction, stuff still gets through.
