this post was submitted on 16 May 2025
89 points (96.8% liked)
Technology
LLMs have no more beliefs than a parrot does. They just repeat whatever opinions and biases exist in their training data. Although that's not too different from humans in some respects.
Humans can be held accountable
*not all humans. Apparently. Like billionaires and the presidents they bought.
Less. A parrot can believe that it's going to get a cracker.
You could make an AI that had that belief too, and an LLM might be a component of such a system, but our existing systems don't do anything like that.
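A toy sketch of what that distinction might look like, purely for illustration (none of this reflects a real product; `llm_generate` is a made-up stand-in for a model call): the belief lives in persistent agent state, and the language model is just one component that verbalizes it.

```python
class Agent:
    def __init__(self):
        # Persistent state the agent maintains across interactions;
        # a bare LLM has no equivalent carried between prompts.
        self.beliefs = {"cracker_incoming": 0.5}

    def observe(self, event):
        # Update the belief from evidence (crude fixed-step update).
        if event == "human_reaches_for_cracker_jar":
            self.beliefs["cracker_incoming"] = min(
                1.0, self.beliefs["cracker_incoming"] + 0.3
            )
        elif event == "human_leaves_room":
            self.beliefs["cracker_incoming"] = max(
                0.0, self.beliefs["cracker_incoming"] - 0.3
            )

    def speak(self):
        # The LLM component only verbalizes the state it is handed.
        return llm_generate(
            f"belief that a cracker is coming: {self.beliefs['cracker_incoming']:.1f}"
        )


def llm_generate(context):
    # Hypothetical stand-in for a real model call; the point is that
    # the belief lives outside the model, in the agent's state.
    return f"squawk! ({context})"


agent = Agent()
agent.observe("human_reaches_for_cracker_jar")
print(agent.speak())  # squawk! (belief that a cracker is coming: 0.8)
```

Whether state-plus-update rules deserves the word "belief" is the philosophical question, but it's at least more than next-token prediction alone gives you.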
I know someone with a parrot; it definitely has core beliefs, mostly about food and how much attention you should be paying to it.