this post was submitted on 18 Oct 2023
PC Gaming
For PC gaming news and discussion. PCGamingWiki
Wouldn't that just be a GPU? That's literally what all our AIs run on. Just a ton of tiny little processors running in parallel.
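For illustration, here is a minimal CUDA sketch of what "a ton of tiny little processors running in parallel" looks like in code; the kernel, element count, and launch sizes are arbitrary and just stand in for the per-element math a GPU does during inference:

    // Each of thousands of lightweight threads handles one element of the array.
    #include <cstdio>

    __global__ void scale(const float* in, float* out, float k, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
        if (i < n) out[i] = k * in[i];
    }

    int main() {
        const int n = 1 << 20;  // ~1M elements, chosen arbitrarily
        float *in, *out;
        cudaMallocManaged(&in, n * sizeof(float));
        cudaMallocManaged(&out, n * sizeof(float));
        for (int i = 0; i < n; ++i) in[i] = 1.0f;

        // Launch ~4096 blocks of 256 threads: the "tiny processors in parallel".
        scale<<<(n + 255) / 256, 256>>>(in, out, 2.0f, n);
        cudaDeviceSynchronize();

        printf("out[0] = %f\n", out[0]);
        cudaFree(in);
        cudaFree(out);
        return 0;
    }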
That's kind of like saying "Wouldn't that just be a CPU?" about the GPU. It can be optimized further. The question is whether it's worth optimizing for at the consumer level, the way GPUs were.
While that's true now, in the future there may be discrete AI accelerators in the same way we have hardware video encoders.
Have you not seen the size of modern GPUs? It'll just be another chip on the 3.5-slot, 600 W GPU.
They already exist.
They mean something more along the lines of an ASIC: a board engineered specifically for AI/ML.