this post was submitted on 19 Jan 2024

PC Gaming

[–] kakes@sh.itjust.works 1 points 7 months ago (1 children)

Well, they actually can, at least to an extent. All you need to do is encode the worldstate in a way the LLM can understand, then decode the LLM's response back into changes to that worldstate (most examples I've seen use JSON to good effect).

That doesn't seem to be the focus of most of these developers though, unfortunately.
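A rough sketch of what that encode/decode loop might look like (all names and shapes here are made up for illustration, and the actual model call is replaced with a stub):

```python
import json

def encode_worldstate(state: dict) -> str:
    """Serialize the worldstate into a prompt the model can read."""
    return (
        "You are an NPC in a game. Current worldstate:\n"
        + json.dumps(state, indent=2)
        + '\nReply ONLY with JSON of the form {"action": ..., "target": ...}.'
    )

def decode_response(reply: str) -> dict:
    """Parse the model's JSON reply back into a structured action."""
    return json.loads(reply)

def fake_llm(prompt: str) -> str:
    # Stub standing in for a real chat-completion API call.
    return '{"action": "flee", "target": "town_gate"}'

state = {"npc": {"hp": 12, "location": "market"}, "player": {"hostile": True}}
action = decode_response(fake_llm(encode_worldstate(state)))
print(action["action"])
```

In practice you'd also validate the parsed JSON against the set of actions the game actually supports, since the model can and will produce malformed or out-of-vocabulary output.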

[–] bionicjoey@lemmy.ca 3 points 7 months ago (1 children)

That assumes the model has been trained on a large corpus of that worldstate encoding and understands what the worldstate means in the context of its actions and responses. That's basically impossible with the language models we have now.

[–] kakes@sh.itjust.works 1 points 7 months ago

I disagree. Take this paper for example - keeping in mind it's a year old already (and was using GPT-3.5 Turbo).

The basic idea is pretty solid, honestly. Representing worldstate for an LLM is essentially the same as how you would represent it for something like a GOAP (Goal-Oriented Action Planning) system anyway, so it's not a new idea by any stretch.
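To make the GOAP comparison concrete, here's a toy sketch (facts and action names are invented): a GOAP-style worldstate is just a flat set of key/value facts, and the exact same dict serializes straight to JSON for an LLM prompt.

```python
import json

worldstate = {
    "has_weapon": True,
    "enemy_visible": False,
    "at_location": "armory",
    "health": 75,
}

def preconditions_met(state: dict, preconds: dict) -> bool:
    """A GOAP planner gates each action on preconditions over the state."""
    return all(state.get(k) == v for k, v in preconds.items())

attack = {"has_weapon": True, "enemy_visible": True}
patrol = {"has_weapon": True}

print(preconditions_met(worldstate, attack))   # blocked: no enemy visible
print(preconditions_met(worldstate, patrol))

# The same representation feeds an LLM prompt unchanged.
prompt = "Worldstate:\n" + json.dumps(worldstate, indent=2)
```

The only real difference is the consumer: a planner does symbolic precondition checks over the facts, while the LLM reads the serialized form as text.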