this post was submitted on 22 Aug 2024
831 points (96.4% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code, there's also Programming Horror.

[–] MystikIncarnate@lemmy.ca 5 points 3 weeks ago

To be fair, I knew a lot of people who struggled with word problems in math class.

[–] Rhaedas@fedia.io 4 points 3 weeks ago

I tried it with my abliterated local model, thinking that maybe its alteration would help, and it gave the same answer. I asked if it was sure, and it then corrected itself (maybe re-examining the word in a different way?). I then asked how many Rs are in "strawberries", thinking it would either treat it as a new word and give the same incorrect answer, or, since the earlier exchange was still in its context, say something about it also being 3 Rs. Nope. It said 4 Rs! I then said "really?", and it corrected itself once again.

LLMs are very useful as long as you know how to maximize their power and don't assume whatever they spit out is absolutely right. I've had great luck using mine to help with programming (basically as a Google, but formatting things far better than if I looked the stuff up myself), but I've also found some of the simplest errors in the middle of a lot of helpful output. It's at an assistant level, and you need to remember that an assistant helps you; they don't do the work for you.
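
For reference, the count itself is trivial to verify deterministically; a minimal Python sketch (nothing beyond the standard library, and not something the commenter says they actually ran) confirms both words have 3 Rs:

```python
# Count occurrences of "r" the boring, deterministic way.
for word in ("strawberry", "strawberries"):
    print(word, word.lower().count("r"))
# strawberry 3
# strawberries 3
```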

[–] mtchristo@lemm.ee 4 points 3 weeks ago

I stand with chat-gpt on this. Whoever created these double letters is the idiot here.

[–] tourist@lemmy.world 2 points 3 weeks ago

Is there anything else or anything else you would like to discuss? Perhaps anything else?

Anything else?

Tbf, this is the kind of answer a person might give if you asked them the question randomly.
