this post was submitted on 23 Mar 2024
74 points (93.0% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

[–] PonyOfWar@pawb.social 28 points 5 months ago (3 children)

Honestly, this is one of the AI applications I see real potential in. Developers could train NPCs on extensive backstories, and interactions with them could be way more dynamic than what we currently get. Something like a more advanced version of "Starship Titanic", if anyone remembers that.

[–] MotoAsh@lemmy.world 8 points 5 months ago* (last edited 5 months ago) (2 children)

You are imagining a supercomputer's LLM running an NPC.

It literally cannot be that fancy. Maybe they can fake it and fool a few rubes, but no, there will be no deep characters run by this.

[–] PonyOfWar@pawb.social 4 points 5 months ago* (last edited 5 months ago)

The way it works right now is usually over the cloud. As a developer, I've already tried out "Convai" a bit, a platform where you can create LLM NPCs and put them in Unreal Engine. It's pretty neat. Not perfect, but you can definitely give characters thousands of lines of backstory if you want, and they will act in character. They will also remember any conversations a player had with them previously and can refer back to them in later convos. It can still be fairly obvious that you're talking to an LLM, though, if you know what to ask and what to look for, and due to the cloud-based setup there is also some delay between the player input and the response.

But it has a lot of potential for dialog systems where you can do way more than just choose between 4 predefined sentences, especially once running these things locally is no longer a performance issue.
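The backstory-plus-memory setup described above can be sketched in a few lines. This is purely illustrative, not Convai's actual API: the NPC name, backstory, and `call_llm` function are all made up, with `call_llm` stubbed out so the example runs offline instead of hitting a real cloud endpoint.

```python
# Sketch of an LLM-backed NPC: a fixed backstory goes into the system
# prompt, and prior exchanges are replayed on every call so the model
# can refer back to earlier conversations.

def call_llm(messages):
    # Placeholder for a real cloud API call; a real implementation would
    # POST `messages` to an LLM endpoint and return its text reply.
    return f"(in-character reply, aware of {len(messages) - 1} prior messages)"

class NPC:
    def __init__(self, name, backstory):
        self.system_prompt = (
            f"You are {name}. Stay in character at all times.\n"
            f"Backstory: {backstory}"
        )
        self.history = []  # persists across conversations with the player

    def talk(self, player_input):
        self.history.append({"role": "user", "content": player_input})
        messages = [{"role": "system", "content": self.system_prompt}]
        messages += self.history
        reply = call_llm(messages)
        self.history.append({"role": "assistant", "content": reply})
        return reply

guard = NPC("Willem", "A retired city guard who distrusts outsiders.")
print(guard.talk("Have you seen anything strange lately?"))
```

Because `self.history` is replayed every turn, the prompt grows with each exchange, which is one reason cloud latency and token costs add up in this approach.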

[–] owen@lemmy.ca 4 points 5 months ago

I think you could make it work by giving each of them a limited word pool and pre-set phrases to fall back on for panic/confusion.
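That idea could look something like the sketch below: check a generated line against the NPC's allowed vocabulary, and swap in a canned "confusion" phrase if it strays outside. The word pool and fallback phrases here are invented for illustration.

```python
# Sketch of the limited-word-pool idea: keep a generated line only if
# every word is in the NPC's vocabulary, otherwise fall back to a
# pre-set phrase that covers for the model's confusion.

import re

ALLOWED_WORDS = {"the", "gate", "closes", "at", "dusk", "road", "is", "safe", "traveler"}
FALLBACK_PHRASES = ["I... I don't follow you.", "Hm? Speak plainly."]

def constrain_reply(generated_line, fallback_index=0):
    words = re.findall(r"[a-z']+", generated_line.lower())
    if all(w in ALLOWED_WORDS for w in words):
        return generated_line
    return FALLBACK_PHRASES[fallback_index % len(FALLBACK_PHRASES)]

print(constrain_reply("The gate closes at dusk"))        # in-pool: passes through
print(constrain_reply("Quantum flux capacitors, dude"))  # out-of-pool: canned fallback
```

A post-hoc filter like this is the simplest version; a fancier one could constrain the model's token sampling directly so out-of-pool words never get generated in the first place.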

[–] fruitycoder@sh.itjust.works 4 points 5 months ago

There are a couple of indies and mods working on that! The trick is definitely lowering the compute needed, maybe through a series of fine-tuned models (which might also cut down on the anachronisms).

[–] swab148@startrek.website 3 points 5 months ago

I still have my copy of the book!