A worldview is your current representational model of the world around you. For example, you know you're a human on Earth in a physical universe with a set of rules; you have a mental representation of your body and its capabilities, your location, and the physicality of the things in your location. It can also include abstract things, like your personality, your relationships, and your understanding of what's possible in the world.
Basically, you live in reality, but you need a way to store a representation of that reality in your mind in order to interact with it and understand it.
The simulation part is your ability to imagine manipulating that reality to achieve a goal. If you break that down, you're trying to convert reality from your perceived current real state A to an imagined desired state B. Reasoning is coming up with a plan to convert the worldview from state A to state B step by step. Say you want to brush your teeth: you want to convert your worldview of you having dirty teeth to you having clean teeth, and to do that you reason that you need to follow a few steps, like moving your body to the bathroom, retrieving tools (toothbrush and toothpaste), and applying mechanical action to your teeth to clean them. You created a step-by-step plan to change the state of your worldview to a new desired state you came up with. It doesn't need to be physical either; it could be an abstract goal, like calculating a tip for a bill. It can also be a grand goal, like going to college or creating a mathematical proof.
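To make that concrete, here's a minimal sketch of state-A-to-state-B reasoning as a search over imagined states. Everything in it (the state facts, the actions, the `plan` function) is hypothetical and invented just for illustration, not a claim about how any real system works:

```python
from collections import deque

# A worldview "state" is just a set of facts believed true right now.
# All facts and actions here are hypothetical, invented for illustration.
state_a = frozenset({"teeth_dirty", "in_bedroom"})
goal = "teeth_clean"

# Each action: name -> (preconditions, facts it adds, facts it removes).
actions = {
    "walk_to_bathroom": ({"in_bedroom"}, {"in_bathroom"}, {"in_bedroom"}),
    "grab_toothbrush":  ({"in_bathroom"}, {"holding_brush"}, set()),
    "brush_teeth":      ({"holding_brush", "teeth_dirty"},
                         {"teeth_clean"}, {"teeth_dirty"}),
}

def plan(start, goal_fact):
    """Breadth-first search over *imagined* states: each action is applied
    to a mental copy of the world, never the world itself, until some
    imagined state contains the goal fact."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, steps = queue.popleft()
        if goal_fact in state:
            return steps
        for name, (pre, add, rem) in actions.items():
            if pre <= state:  # action is applicable in this imagined state
                nxt = frozenset(state - rem) | frozenset(add)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None  # no sequence of known actions reaches the goal

print(plan(state_a, goal))
# -> ['walk_to_bathroom', 'grab_toothbrush', 'brush_teeth']
```

The key point is that the search never touches the real world: every candidate action is tried against an imagined copy of the state, which is the scratchpad-style simulation described above.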
LLMs don't have a representational model of the world, and they don't have a working memory or a world simulation to use as a scratchpad for testing out reasoning. They just take a sequence of words and retrieve the word that is probabilistically and relationally likely to be a good continuation, based on their training data.
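For contrast, here's roughly what that next-word retrieval looks like in miniature. This is a toy bigram model of my own invention; real LLMs are neural networks trained on vastly more data, but the "sample a statistically likely continuation" idea is the shared principle:

```python
import random
from collections import Counter, defaultdict

# Tiny "training corpus" (hypothetical, just for the demo).
training_text = "i brush my teeth then i rinse my mouth then i sleep".split()

# Count how often each word follows each other word (a bigram model).
follows = defaultdict(Counter)
for cur, nxt in zip(training_text, training_text[1:]):
    follows[cur][nxt] += 1

def next_word(word):
    """Sample a continuation weighted by how often it followed `word`."""
    counts = follows[word]
    if not counts:  # dead end in the toy corpus
        return None
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

word, output = "i", ["i"]
for _ in range(5):
    word = next_word(word)
    if word is None:
        break
    output.append(word)
print(" ".join(output))  # e.g. "i brush my teeth then i"
```

Notice there's no state being maintained or simulated anywhere; the only thing consulted is the co-occurrence statistics of the training text.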
They could be a really important cortex that assists in developing a worldview model, but in their current granular state as single-task AI models, they cannot do reasoning on their own.
Knowledge retrieval is an important component that assists in reasoning, though, so LLMs can still play a very important role there.
Interesting. I'm curious to know more about what you think of training datasets. Seems like they could be described as a stored representation of reality that maybe checks the boxes you laid out. It's a very different structure of representation than what we have as animals, but I'm not sure it can be brushed off as trivial. The way an AI interacts with a training dataset is mechanistic, but as you describe, human worldviews can be described in mechanistic terms as well (I do X because I believe Y).
You haven't said it, so I might be wrong, but are you pointing to free will and imagination as somehow tied to intelligence in some necessary way?
I think a worldview is all about simulation and maintaining state. It's not really about making associations, but rather about maintaining some kind of up-to-date, imaginary state that you can simulate on top of to represent the world. I think it needs to be a very dynamic thing, which is a pretty different paradigm from the ML training methodology.
Yes, I view these things as foundational to free will and imagination, but I'm trying to think more low-level than that. Simulation facilitates imagination, and reasoning facilitates motivation, which facilitates free will.
Are those things necessary for intelligence? Well, it depends on your definition, and everyone has a different one, ranging from reciting information to full-blown consciousness. Personally, I don't really care about coming up with a rigid definition; it's just a word. I care more about the attributes. I think LLMs are a good knowledge engine, and knowledge is a component of intelligence.