this post was submitted on 20 Jul 2023
61 points (100.0% liked)
Technology
Isn't this already possible though? Granted, AI can do this exponentially faster: write the article, generate the deepfakes, then publish, or whatever. But again, can't regular people already do this if they want? I mean, obvious exceptions aside, it's not AI that's generating deepfakes of politicians and celebrities; it's people using the tool.
It's been said already, but AI as a tool can be abused just like anything else. It's not AI that is unethical (necessarily); it's the humans who use it unethically.
I dunno. I guess I just think about the early internet and the amount of shareware and forwards-from-grandma (if you read this letter you have 5 seconds to share it, early-2000s type stuff) and how it's evolved into text-to-speech DIY crafts. AI is just the next step on the path we were already headed down. We were doing all this without AI already; it's just so much more accessible now (which, IMO, is the only way for AI to truly be used for good. Either it's 100% accessible for all, or it's hoarded away.)
This also means that there are going to be people who use it for shitty reasons. These are the same types of people who are the reason we have warning signs and laws in the first place.
It seems to come down to this: do we let something be used despite the harm it can do? I think there are levels to it, but the potential for good is just as high as the potential for disaster. It seems wrong to stop AI from possibly finding cures for cancer, or doing genetic sequencing for other ailments, just because some creeps can use it for deepfakes. Otherwise, the deepfakes would still have existed without AI, and we would be without any of the benefits AI could give us.
Note: for as interested and hopeful as I am for AI as a tool, I also try to be very aware of how harmful it could be. But most ways I look at it, people like you and me using AI in various ways for personal projects, good or bad, just seems inconsequential compared to the sheer speed with which AI can create. Be it code, assets, images, text, puzzles, or patterns, we have one of our first major technological advancements in ages, and half of us are arguing over who gets to use it and why they shouldn't.
Last little tidbit: think about the AI art/concepts you've seen in the last year. Gollum as playing cards, teenage mutant ninja turtles as celebs, deepfakes, whathaveyou. Think about the last time you saw AI art. Do you feel as awed/impressed/annoyed by the AI art of yesterday as you did by the AI art of last year? Probably not. You probably saw it, thought "AI," and moved on.
I've got a social hypothesis that this is what deepfake generations are going to be like. It doesn't matter what images get created, because a thousand other creeps had the same idea and posted the same thing. At a certain point, desensitization sets in and it all becomes redundant. So just because this can happen slightly more easily, are we going to sideline all of the rest of the good the tool can do?
Don't get me wrong, I don't disagree by any means. It's an interesting place to be stuck, I'm just personally hoping the solution is pro-consumer. I really think a version of current AI could be a massive gain to people's daily lives, even if it's just for hobbies and mild productivity. But it only works if everyone gets it.