this post was submitted on 18 Oct 2023
4 points (75.0% liked)

PC Gaming

8501 readers
321 users here now

For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)

founded 1 year ago

The Nvidia NV1 was released in 1995; it was the first GPU with 3D capabilities for PC... from there, we know how things went.

Now it's 2023, so let's make some "retro-futuristic" predictions... what would you think of an AI board with an open-source driver and an open API like Vulkan, which you could buy to power the AI in your video games? Would that make sense to you? What price range should it be in?

What's it supposed to do for your games... well, that depends on the game. The quickest example I can think of is having endless discussions with the NPCs in your average single-player fantasy RPG.

For example, the game loads your 4~5 companions with set psychologies/behaviors: they are fixated on the main quest goal (talking to them is like talking to fanatics; this keeps the main quest as stable as possible), but you can "break" them by attempting to reveal certain truths (for example, breaking the fourth wall). If you go down that path, the game warns you that you're probably going to lock yourself out of the main quest (like in Morrowind when you kill an essential NPC).
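A minimal sketch of what that companion logic could look like, assuming the simplest possible model: a per-companion counter of fourth-wall-breaking attempts that, past a threshold, "breaks" the companion and locks the main quest. All names and thresholds here (`Companion`, `STABILITY_THRESHOLD`, etc.) are invented for illustration, not part of any real engine:

```python
from dataclasses import dataclass, field

STABILITY_THRESHOLD = 3  # fourth-wall breaks tolerated before a companion "breaks"

@dataclass
class Companion:
    name: str
    fixation: str = "main quest"  # the goal the AI keeps steering dialogue toward
    breaks: int = 0               # fourth-wall-breaking attempts by the player

    @property
    def broken(self) -> bool:
        return self.breaks >= STABILITY_THRESHOLD

@dataclass
class Game:
    companions: list = field(default_factory=list)
    main_quest_locked: bool = False

    def talk(self, companion: Companion, fourth_wall_break: bool = False) -> str:
        if fourth_wall_break:
            companion.breaks += 1
            if companion.broken:
                # Like killing an essential NPC in Morrowind: the quest line is gone.
                self.main_quest_locked = True
                return f"{companion.name}: ...the thread of prophecy is severed."
            return f"{companion.name}: What? Focus on the {companion.fixation}!"
        return f"{companion.name}: We must press on with the {companion.fixation}."
```

In a real implementation the `talk` call would feed the companion's fixation and break count into the prompt of whatever model the AI board runs, but the lock-out bookkeeping would stay this simple.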

[–] Blamemeta@lemm.ee 2 points 1 year ago (3 children)

Wouldn't that just be a GPU? That's literally what all our AIs run on. Just a ton of tiny little processors running in parallel.

[–] wccrawford@lemmy.world 7 points 1 year ago

That's kind of like saying "Wouldn't that just be a CPU?" about the GPU. It can be optimized. The question is whether it's worth optimizing for at the consumer level, like GPUs were.
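The optimization argument is roughly this: most AI workloads boil down to matrix multiplication, and hardware (or hardware-backed libraries) built for that beats a general-purpose core doing one multiply-add at a time. A toy illustration in Python, using NumPy's BLAS-backed `@` as a stand-in for any accelerated backend (this is a sketch of the principle, not a benchmark of real AI hardware):

```python
import time
import numpy as np

n = 120
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# Naive triple loop: one multiply-add at a time, like unoptimized scalar code.
start = time.perf_counter()
c_naive = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        for k in range(n):
            c_naive[i, j] += a[i, k] * b[k, j]
naive_t = time.perf_counter() - start

# One call into a hardware-optimized routine (BLAS) doing the same math.
start = time.perf_counter()
c_fast = a @ b
fast_t = time.perf_counter() - start

print(f"naive: {naive_t:.3f}s, optimized: {fast_t:.6f}s")
```

Same result, orders of magnitude apart; a dedicated AI board is that gap taken to silicon.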

[–] meteokr@community.adiquaints.moe 4 points 1 year ago (2 children)

While that is true now, in the future maybe there will be discrete hardware AI accelerators in the same way we have hardware video encoding.

[–] baconisaveg@lemmy.ca 3 points 1 year ago

Have you not seen the size of modern GPUs? It'll just be another chip on the 3.5-slot 600W GPU.

They already exist.

They mean something more along the lines of an ASIC: a board specifically engineered for AI/ML.