[–] OrderedChaos@lemmy.world 10 points 10 months ago (2 children)

I'm confused by this idea. Maybe I'm just seeing it from the wrong point of view. If you asked me to do the same thing, I would fail miserably.

[–] KairuByte@lemmy.dbzer0.com 5 points 10 months ago

Not the original intent, but you'd likely throw your hands up right away and say you don't know, whereas an LLM would hallucinate an answer.

[–] bionicjoey@lemmy.ca 1 points 10 months ago (1 children)

But some humans can, since these puzzles require a simultaneous understanding of what words mean and how they are spelled.

[–] General_Effort@lemmy.world 2 points 10 months ago

What should we conclude about most humans who cannot solve these crosswords?

It should be relatively easy to train an LLM to solve these puzzles. I am not sure what that would show.
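
For illustration, here is a rough sketch of what that training could look like, using Hugging Face transformers to fine-tune a small causal LM on clue/answer pairs. Everything specific here is a placeholder assumption, not a tested recipe: the `crossword_pairs.jsonl` dataset, the base model, and the hyperparameters are all hypothetical.

```python
# Minimal sketch: fine-tune a small causal LM on crossword clue/answer pairs.
# Assumes a hypothetical file "crossword_pairs.jsonl" of records like
# {"clue": "...", "answer": "..."}; model and settings are illustrative.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # any small causal LM would do for the sketch
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

dataset = load_dataset("json", data_files="crossword_pairs.jsonl")["train"]

def to_prompt(example):
    # Format each pair as "Clue: ... (N letters)\nAnswer: ..." so the model
    # is conditioned on both the clue's meaning and the answer's length.
    text = (f"Clue: {example['clue']} ({len(example['answer'])} letters)\n"
            f"Answer: {example['answer'].upper()}{tokenizer.eos_token}")
    return tokenizer(text, truncation=True, max_length=64)

tokenized = dataset.map(to_prompt, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="crossword-lm",
                           num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=tokenized,
    # Standard next-token objective; the collator builds the labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Putting the letter count in the prompt is the one design choice worth noting: it nudges the model toward the spelling side of the task rather than just the meaning side. Whether a model that succeeds at this is doing anything more than pattern-matching its training distribution is, as above, the part I'm not sure about.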