I use LLMs to have things explained to me, too... but if you want to know how big a grain of salt to take their answers with, try asking about something niche and complicated that you already know the answer to.
They can be useful for figuring out the correct terminology so that you can find the answer on your own, or for pointing out some very obvious mistakes in your understanding (though they will still miss most of them).
Please don't use those things as answer machines.
I'm going to use those things as answer machines and you can't stop me.
Jokes aside, I always validate what chatbots tell me, and not just the important things. I use GPT-4 for work, and 90% of the time it can show me how to use very specific functions in complex ways, but yesterday (for the first time in a while) it made up a function that didn't exist. To its credit, when I asked, "Are you sure about [function]?", it replied, "I'm sorry, I got confused. That function doesn't exist. However, look into X, Y, Z for further resources," and I did, and they were the correct things to look into.
If you press it the same way again ("are you sure the function doesn't exist?"), there is a high chance it will "rectify" its rectification.
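That flip-flopping is exactly why a programmatic check beats re-asking. Here's a minimal sketch of the "validate before trusting" step, assuming the official `openai` Python package (v1.x) with `OPENAI_API_KEY` set in the environment; the example question is hypothetical, and whatever dotted path the model returns is checked against the real module instead of taking its word for it:

```python
# Minimal sketch: ask the model to name a function, then verify the
# suggestion actually exists before trusting it. Assumes the official
# `openai` package (v1.x) and OPENAI_API_KEY in the environment.
import importlib

from openai import OpenAI

client = OpenAI()


def suggest_function(question: str) -> str:
    """Ask the model for a single function as a dotted Python path."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": "Reply with only a dotted Python path, e.g. os.path.join.",
            },
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content.strip()


def function_exists(dotted_path: str) -> bool:
    """Check that module.attr really exists, instead of trusting the model."""
    module_name, _, attr = dotted_path.rpartition(".")
    if not module_name:
        return False
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, attr)


# Hypothetical example question.
suggestion = suggest_function(
    "Which stdlib function splits a path into directory and filename?"
)
if function_exists(suggestion):
    print(f"{suggestion} exists; safe to go read its docs.")
else:
    # Same move as in the comments above: push back and ask for alternatives.
    print(f"{suggestion} doesn't exist; ask 'Are you sure about {suggestion}?'")
```

This only catches invented names, of course; for a real function used in a subtly wrong way, reading the actual docs is still the safety net.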