this post was submitted on 02 Dec 2024
65 points (88.2% liked)
Asklemmy
you are viewing a single comment's thread
view the rest of the comments
I use GPT in the sense of "I need to solve X problem, are there established algorithms for this?" which usually gives me a good starting point for actual searching.
Most recent use case was judging the similarity of two strings: I had never heard of "Levenshtein distance" before, but once I had that keyword it was easy to work from there (rough sketch below).
Also: cmake and bash boilerplate
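For anyone curious, here's a minimal sketch of the classic dynamic-programming Levenshtein distance in Python; the function name and the example strings are just for illustration, not anything from the thread:

```python
# Minimal sketch of Levenshtein (edit) distance via dynamic programming.
# Generic textbook version, shown only to illustrate the term above.
def levenshtein(a: str, b: str) -> int:
    # prev[j] holds the distance between the first i-1 chars of a and the first j chars of b.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # -> 3
```

It runs in O(len(a) * len(b)) time and only keeps two rows of the table in memory.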
Describing a concept and getting the term is awesome with an LLM.
I've found documentation and discussions of various strategies I'm considering in tech work.
I describe my idea, the LLM gives me the existing term for that strategy, and then I can find discussion, guides, and theory about that. Keeps me from reinventing the wheel.
It makes sense when you think about it, too: it's a language model, so it should be expected to do a decent job as a glorified dictionary.