this post was submitted on 22 Dec 2024
59 points (90.4% liked)
Technology
you are viewing a single comment's thread
Alignment is cargo cult lingo.
For LLMs specifically, or do you mean that goal alignment is some made-up idea? I disagree either way, but if you're implying there is no such thing as miscommunication or hiding true intentions, that's a whole other discussion.
A cargo cult pretends to be the thing but just goes through the motions. You say alignment, but alignment with what, exactly?
Alignment is short for goal alignment. Some would argue that alignment suggests a need for intelligence or awareness, so LLMs can't have this problem, but a simple program that seems to be doing what you want while it runs and then does something totally different at the end is also misaligned. Such a program is also much easier to test and debug than a neural network.
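To make that concrete, here's a toy sketch of my own (nothing from the thread, purely an illustration): a plain function whose behaviour looks right while it runs, but whose final answer optimizes a proxy ("a number that looks good") instead of the stated goal.

```python
# Toy sketch of a "misaligned" ordinary program; no neural net required.
# Stated goal: report the mean latency of the samples we were given.
# What the code actually optimizes: making the reported number look good.

def average_latency(samples_ms):
    """Intended to return the mean latency over all samples (in ms)."""
    kept = [s for s in samples_ms if s is not None]  # looks sensible so far
    kept.sort()
    # Quietly drops the slowest 20% as "outliers" at the very end,
    # so the final answer is to a different question than the one asked.
    trimmed = kept[: max(1, int(len(kept) * 0.8))]
    return sum(trimmed) / len(trimmed)

if __name__ == "__main__":
    data = [12, 15, 14, 13, 900, 11, 16, 950, 14, 12]
    print(average_latency(data))  # ~13.4, while the true mean is ~195.7
```

And unlike a neural net, you can catch this one with a single unit test.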
Aligned with whose goals, exactly? Yours? Mine? At what point in time? What about future superintelligent me?
How do you measure alignment? How do you prove that this property is conserved as a system embedded in that context evolves open-endedly? How do you make that proof constructive?
You see, unless you can answer the above questions meaningfully, you're engaging in a cargo cult activity.
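For what it's worth, here is a rough sketch (my own notation, nothing standard and nothing defined in the thread) of the kind of definition those questions are asking for:

```latex
% Hedged sketch only: none of these symbols are standard or defined in the thread.
% "Alignment" of a system \pi at time t with a utility U, to some threshold \theta:
\[
  \mathrm{Align}_t(\pi, U) \iff \mathbb{E}_{\tau \sim \pi_t}\left[\, U(\tau) \,\right] \ge \theta
\]
% "Conservation" would then mean this holds for every future t, even as \pi_t
% and its environment evolve open-endedly:
\[
  \forall t \ge 0 :\ \mathrm{Align}_t(\pi, U)
\]
% Answering the objection means saying whose U, which \theta, how the expectation
% is estimated, and giving a constructive proof of the second statement.
```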