[–] exocortex@discuss.tchncs.de 21 points 2 months ago (2 children)

It cannot "analyze" it. That's fundamentally not how LLMs work. An LLM has a finite vocabulary of "tokens": whole words and word-pieces like "dog" and "house", but also "berry", "straw", or "rasp". When it reads the input, it splits the text into those recognized tokens, like a lookup table: the input becomes "token15, token20043, token1923, token984, token1234, ..." and so on. The LLM "thinks" of these tokens as coordinates in a very high-dimensional space, but it cannot go back and examine the actual contents (letters) of each token. It has to get the information about the number of "r"s from somewhere else. So it has likely ingested some text where the number of "r"s in "strawberry" is discussed, but it can never actually "test" it.
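
As a rough sketch of what this looks like in practice (this assumes OpenAI's tiktoken library and its cl100k_base encoding; other models use different tokenizers and will split the word differently):

```python
# Sketch: how a tokenizer splits a word into subword tokens.
# Requires: pip install tiktoken (OpenAI's open-source tokenizer library).
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4-era models
token_ids = enc.encode("strawberry")
print(token_ids)  # a short list of integer IDs, not letters

# Each ID maps back to a chunk of bytes, e.g. b"str", b"aw", b"berry" --
# the model only ever sees the IDs, never the individual characters.
for tid in token_ids:
    print(tid, enc.decode_single_token_bytes(tid))
```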

A completely new architecture or paradigm would be needed to make these LLMs capable of reading letter by letter and keeping some kind of count-memory.
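
For contrast, once you actually have the characters, the check is a one-liner; this is just to show what "reading letter by letter" would mean:

```python
# Trivial at the character level -- but the model never sees characters.
print("strawberry".count("r"))  # 3
```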

[–] pyre@lemmy.world 18 points 2 months ago (1 children)

the sheer audacity to call this shit intelligence is making me angrier every day

[–] Matriks404@lemmy.world -5 points 2 months ago

Exactly my point. But thanks for explaining it further.