this post was submitted on 05 Feb 2025
493 points (96.8% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.

founded 1 year ago
[–] IndustryStandard@lemmy.world 2 points 2 hours ago

Anon volunteers for Neuralink

[–] xelar@lemmy.ml 6 points 3 hours ago* (last edited 3 hours ago)

Brainless GPT coding is becoming the new norm at uni.

Even if I get the code via ChatGPT, I try to understand what it does. How are you going to maintain those hundreds of lines if you don't know how they work?

Not to mention, you won't cheat your way through a recruitment interview.

[–] Fleur_@lemm.ee 18 points 13 hours ago (1 children)

Why would you even be taking the course at this point

[–] UltraGiGaGigantic@lemmy.ml 23 points 11 hours ago (2 children)

Money can be exchanged for housing, food, healthcare, and more necessities.

[–] _stranger_@lemmy.world 5 points 3 hours ago

Yeah, Anon paid an AI to take the class he paid for. Setting his money on fire would have been more efficient.

[–] licheas@sh.itjust.works 6 points 13 hours ago (2 children)

Why do they even care? It's not like your future bosses are going to give a flying fuck how you get your code. At least, they won't until you cause the machine uprising or something.

[–] WoodScientist@sh.itjust.works 1 points 1 hour ago

They absolutely will. Companies hire programmers because they specifically need people who can code. Why would I hire someone to throw prompts into ChatGPT? I can do that myself. In the time it takes me to write instructions to an employee on the code I want them to create with ChatGPT, I could just throw the prompt into ChatGPT myself.

[–] fleg@szmer.info 5 points 4 hours ago

They are going to care if you can maintain your code. Programming isn't "write, throw it over the fence and forget about it"; you usually have to work with what you, or your coworkers, have already done. "Reading other people's code" is, like, 95% of the programmer's job. Sometimes the output of a week-long, intensive stretch of work is a change to one line of code, the result of a deep understanding of a project that can span many files, sometimes many small applications connected to each other.

ChatGPT et al aren't good at that at all. Maybe they will be in the future, but at the moment they are not.

[–] nednobbins@lemm.ee 76 points 1 day ago (3 children)

The bullshit is that anon wouldn't be fsked at all.

If anon actually used ChatGPT to generate some code, memorize it, understand it well enough to explain it to a professor, and get a 90%, congratulations, that's called "studying".

[–] naught101@lemmy.world 6 points 11 hours ago (3 children)

I don't think that's true. That's like saying that watching hours of guitar YouTube is enough to learn to play. You need to practice too, and learn from mistakes.

[–] Maggoty@lemmy.world 1 points 2 hours ago

No he's right. Before ChatGPT there was Stack Overflow. A lot of learning to code is learning to search up solutions on the Internet. The crucial thing is to learn why that solution works though. The idea of memorizing code like a language is impossible. You'll obviously memorize some common stuff but things change really fast in the programming world.

[–] RobertoOberto@sh.itjust.works 5 points 6 hours ago (1 children)

I don't think that's quite accurate.

The "understand it well enough to explain it to a professor" clause is carrying a lot of weight here - if that part is fulfilled, then yeah, you're actually learning something.

Unless, of course, all of the professors are awful at their jobs too. Most of mine were pretty good at asking very pointed questions to figure out what you actually know, and could easily unmask a bullshit artist with a short conversation.

[–] naught101@lemmy.world 1 points 2 hours ago

I didn't say you'd learn nothing, but the second take was not just to explain (when you'd have the code in front of you to look at), but to actually write new code, for a new problem, from scratch.

[–] nednobbins@lemm.ee 3 points 9 hours ago (1 children)

It's more like playing a song on Guitar Hero enough times to be able to pick up a guitar and convince a guitarist that you know the song.

Code from ChatGPT (and other LLMs) doesn't usually work on the first try. You need to go fix and add code just to get it to compile. If you actually want it to do whatever your professor is asking you for, you need to understand the code well enough to edit it.

It's easy to try for yourself. You can go find some simple programming challenges online and see if you can get ChatGPT to solve a bunch of them for you without having to dive in and learn the code.
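The sort of "simple programming challenge" meant here can be as small as a balanced-brackets check. A minimal Python sketch (the function name and examples are made up for illustration, not from the thread):

```python
def balanced(s):
    """Return True if every bracket in s is closed in the right order."""
    pairs = {")": "(", "]": "[", "}": "{"}
    stack = []
    for ch in s:
        if ch in "([{":
            stack.append(ch)  # remember the opener
        elif ch in pairs:
            # a closer must match the most recent unmatched opener
            if not stack or stack.pop() != pairs[ch]:
                return False
    return not stack  # leftover openers mean unclosed brackets
```

If you can get an LLM to produce something like this and then explain why the final `not stack` matters, you have effectively done the studying the comment describes.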

[–] WarlordSdocy@lemmy.world 2 points 4 hours ago

I mean, I feel like, depending on what kind of problems they started off with, ChatGPT probably could just solve simple first-year programming problems. But yeah, as you get to higher-level classes it will definitely not fully solve the stuff for you, and you'd have to actually go in and fix it.

[–] MintyAnt@lemmy.world 23 points 23 hours ago

Professors hate this one weird trick called "studying"

[–] JustAnotherKay@lemmy.world 13 points 1 day ago (1 children)

Yeah, if you memorized the code and its functionality well enough to explain it in a way that successfully bullshits someone who can sight-read it... you know how that code works. You might need a linter, but you know how that code works and can probably at least fumble your way through a shitty v0.5 of it.

[–] SkunkWorkz@lemmy.world 101 points 1 day ago (30 children)

Yeah, fake. No way you can get 90%+ using ChatGPT without understanding code. LLMs barf out so much nonsense when it comes to code. You have to correct it frequently to make it spit out working code.

[–] WoodScientist@sh.itjust.works 1 points 1 hour ago

Two words: partial credit.

[–] Maggoty@lemmy.world 1 points 2 hours ago

Usually this joke runs with a second point of view asking: do I tell them, or let them keep thinking this is cheating?

[–] AeonFelis@lemmy.world 4 points 12 hours ago
  1. Ask ChatGPT for a solution.
  2. Try to run the solution. It doesn't work.
  3. Post the solution online as something you wrote all on your own, and ask people what's wrong with it.
  4. Copy-paste the fixed-by-actual-human solution from the replies.
[–] Artyom@lemm.ee 10 points 21 hours ago

If we're talking about freshman CS 101, where every assignment is the same year-over-year and it's all machine graded, yes, 90% is definitely possible because an LLM can essentially act as a database of all problems and all solutions. A grad student TA can probably see through his "explanations", but they're probably tired from their endless stack of work, so why bother?
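Freshman CS 101 in this sense means exercises like FizzBuzz, of which any LLM has seen thousands of variants. A Python version (hypothetical, just to pin down the kind of machine-gradable problem meant):

```python
def fizzbuzz(n):
    """Classic machine-graded intro exercise: the numbers 1..n as strings,
    with multiples of 3 -> "Fizz", of 5 -> "Buzz", of both -> "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

A problem this well-worn is exactly where "LLM as a database of all problems and all solutions" holds up.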

If we're talking about a 400 level CS class, this kid's screwed and even someone who's mastered the fundamentals will struggle through advanced algorithms and reconciling math ideas with hands-on-keyboard software.

[–] NigelFrobisher@aussie.zone 6 points 20 hours ago* (last edited 20 hours ago)

I remember so little from my studies that I do tend to wonder if it would really have been cheating to… er… cheat. Higher education was this horrendous ordeal where I had to perform insane memorisation tasks between binge drinking, all so I could get my foot in the door as a dev and then start learning real skills on the job (e.g. "agile" didn't even exist yet, only XP. Build servers and source control were in their infancy. Unit tests were the distant dreams of a madman.)

[–] PlantDadManGuy@lemmy.world 5 points 19 hours ago

I mean at this point just commit to the fraud and pay someone who actually knows how to code to take your exam for you.

[–] SoftestSapphic@lemmy.world 52 points 1 day ago (9 children)

This person is LARPing as a CS major on 4chan

It's not possible to write functional code without understanding it, even with ChatGPT's help.

[–] janus2@lemmy.zip 33 points 1 day ago* (last edited 1 day ago) (6 children)

isn't it kinda dumb to have coding exams that aren't open book? if you don't understand the material, on a well-designed test you'll run out of time even with access to the entire internet

when in the hell would you ever be coding IRL without access to language documentation and the internet? isn't the point of a class to prepare you for actual coding you'll be doing in the future?

disclaimer: did not major in CS, but did have a lot of open-book tests. Failed when I should have failed because I didn't study enough, and passed when I should have passed, because familiarity with the material is what lets you find your references fast enough to complete the test.

[–] Buddahriffic@lemmy.world 12 points 1 day ago

Assignments involved actual coding but exams were generally pen and paper when I got my degree. If a question involved coding, they were just looking for a general understanding and didn't nitpick syntax. The "language" used was more of a c++-like pseudocode than any real specific language.

ChatGPT could probably do well on such exams because making up functions is fair game, as long as it doesn't trivialize the question and demonstrates an overall understanding.
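An exam answer in that style might look like the sketch below: breadth-first search where the grader lets you assume a `neighbors(node)` helper exists. All names here are illustrative, not from the thread, and a toy graph stands in for the assumed helper so the sketch actually runs:

```python
from collections import deque

def bfs_distance(start, goal, neighbors):
    """Shortest hop count from start to goal; neighbors(n) is the kind of
    helper an exam would let you assume exists. Returns -1 if unreachable."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for nxt in neighbors(node):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return -1

# toy stand-in for the assumed helper
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
```

Inventing `neighbors` is fair game in this format because it doesn't trivialize the question; the traversal logic is what demonstrates the overall understanding being graded.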

[–] bitchkat@lemmy.world 8 points 1 day ago

Most of my CS exams in more advanced classes were take home. Well before the internet though. They were some of the best finals I ever took.

[–] UnfairUtan@lemmy.world 205 points 1 day ago* (last edited 1 day ago) (22 children)

https://nmn.gl/blog/ai-illiterate-programmers

Relevant quote

Every time we let AI solve a problem we could’ve solved ourselves, we’re trading long-term understanding for short-term productivity. We’re optimizing for today’s commit at the cost of tomorrow’s ability.

[–] Xanza@lemm.ee 26 points 1 day ago* (last edited 1 day ago) (7 children)

pay for school

do anything to avoid actually learning

Why tho?

[–] aliser@lemmy.world 101 points 1 day ago (1 children)
[–] Agent641@lemmy.world 66 points 1 day ago (1 children)

Probably promoted to middle management instead

[–] Ascend910@lemmy.ml 5 points 1 day ago

virtual machine

[–] boletus@sh.itjust.works 74 points 1 day ago (25 children)

Why would you sign up to college to willfully learn nothing
