submitted 4 months ago* (last edited 4 months ago) by fievel@lemm.ee to c/asklemmy@lemmy.ml

Ok, let's give a little bit of context. I will turn 40 in a couple of months, and I've been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing "good", readable code.

However, for a few months now, I have been really afraid for the future of the job I love because of the progress of artificial intelligence. Very often I can't sleep at night because of this.

I fear that my job, while not completely disappearing, will become a very boring one consisting of debugging automatically generated code, or that the job will disappear altogether.

For now, I'm not using AI. A few colleagues do, but I don't want to, because one, it removes the part of coding I like, and two, I have the feeling that using it is sawing off the branch I'm sitting on, if you see what I mean. I fear that in the near future, people not using it will be fired because management will see them as less productive...

Am I the only one feeling this way? I get the impression that all tech people are enthusiastic about AI.

50 comments
[-] ParsnipWitch@feddit.de 6 points 4 months ago

Your fear is justified insofar as some employers will definitely aim to reduce their workforce by implementing AI workflows.

If you have worked for the same employer all this time, perhaps you don't know this, but a lot of employers do not give two shits about code quality. They want cheap and fast labour, and having fewer people churn out more is a good thing in their eyes, regardless of (long-term) quality. It may sound cynical, but that is my experience.

My prediction is that the income gap will increase dramatically, because good pay will be reserved for the truly exceptional few, while the rest are confronted with yet another tool capitalists will use to increase profits.

Maybe very far down the line there is a blissful utopia where no one has to work anymore. But between then and now, AI would have to get a lot better. Until then it will mainly be used by corporations to justify hiring fewer people.

[-] tunetardis@lemmy.ca 6 points 4 months ago

As a fellow C++ developer, I get the sense that ours is a community with a lot of specialization that may be a bit more difficult to automate out of existence than web designers or what have you? There's just not as large a sample base to train AIs on. My C++ projects have ranged from scientific modelling to my current task of writing drivers for custom instrumentation we're building at work. If an AI could interface with the OS I wrote from scratch for said instrumentation, I would be rather surprised? Of course, the flip side to job security through obscurity is that you may make yourself unemployable by becoming overly specialized? So there's that.

[-] olbaidiablo@lemmy.ca 6 points 4 months ago

AI allows us to do more with less, just like any other tool. It's no different from an electric drill or a powered saw. Perhaps in the future we will see more games with immersive environments, because much of that environment work can be done with AI handling the grunt work.

[-] z00s@lemmy.world 5 points 4 months ago

It won't replace coders as such. There will be devs who use AI to help them be more productive, and there will be unemployed devs.

[-] arthur@lemmy.zip 5 points 4 months ago

Man, it's a tool. It will change things for us, and it is very powerful, but it's still a tool. It does not "know" anything; there's no true intelligence in the things we now call "AI". For now, it's really useful as a rubber duck: it can make interesting suggestions, help you explore big code bases faster, and even be useful for creating boilerplate. But the code it generates usually isn't very trustworthy and is of lower quality.

The reality is not that we will lose our jobs to it, but that companies will expect more productivity from us when using these tools. I recommend trying ChatGPT (the best in class for now) and getting a feel for its strengths and limitations.

Remember: this is just autocomplete on steroids. It does more than the regular version, but it makes the same type of errors.

[-] howrar@lemmy.ca 5 points 4 months ago* (last edited 4 months ago)

If your job truly is in danger, then not touching AI tools isn't going to change that. The best you can do for yourself is to explore what these tools can do for you and figure out whether they can help you become more productive, so that you're not first on the chopping block. Maybe in doing so, you'll find other aspects of programming that you enjoy just as much and that don't yet get automated away by these tools. Or maybe you'll find that they're not all they're hyped up to be, and that will ease your worry.

[-] Lath@kbin.social 4 points 4 months ago

If you are, it should be due to working for the wrong people. Those that don't understand what's what and only seek profit religiously.

Thanks for the readable code though.

[-] CanadaPlus@lemmy.sdf.org 4 points 4 months ago

Give Copilot or similar a try. AI or similar is pretty garbage at the more complex aspects of programming, but it's great at simple boilerplate code. At least for me, that doesn't seem like much of a loss.

[-] ulkesh@beehaw.org 4 points 4 months ago* (last edited 4 months ago)

I’m less worried and disturbed by the current thing people are calling AI than I am by the fact that every company seems to be jumping on the bandwagon with zero idea of how it can and should be applied to their business.

Companies are going to waste money on it, drive up costs, and make the consumer pay for it, causing even more unnecessary inflation.

As for your points on job security — your trepidation is valid, but premature, by numerous decades, in my opinion. The moment companies start relying on these LLMs to do their programming for them is the moment they will inevitably end up with countless bugs and no one smart enough to fix them, including the so-called AI. LLMs seem interesting and useful on the surface, and a person can show many examples of this, but at the end of the day, it’s regurgitating fed content based on rules and measures with knob-tuning — I do not yet see objective strong evidence that it can effectively replace a senior developer.

[-] knightly@pawb.social 3 points 4 months ago

Companies are going to waste money on it, drive up costs, and make the consumer pay for it, causing even more unnecessary inflation.

The "AI" bubble will burst this year, I'd put money on it if I had any.

The last time we saw a bubble like this was "Web3" and we all know how that turned out.

[-] FaceDeer@kbin.social 3 points 4 months ago

I'm in a similar place to you career-wise. Personally, I'm not concerned about becoming just a "debugger." What I expect this job to look like in a few years is more like "the same as now, except I've got a completely free team of 'interns' that do all the menial stuff for me." Every human programmer will become a lead programmer, deciding what our AIs do for us and putting it all together into the finished product.

Maybe a few years further along the AI assistants will be good enough to handle that stuff better than we do as well. At that point we stop being lead programmers and we all become programming directors.

So think of it like a promotion, perhaps.

[-] Damage@feddit.it 3 points 4 months ago

If this follows the path of the industrial revolution, it'll get way worse before it gets better, and not without a bunch of bloodshed

[-] bruhduh@lemmy.world 3 points 4 months ago

Imagine it's like having an intern under you helping you with everything; the quality of the code will still be on you regardless.

Our company uses AI tools as just that, tools to help us do the job without having to do the boring stuff.

Like, I can now just write a comment about state for a modal and it will auto-generate the repetitive code, so I don't have to type const [isModalOpen, setIsModalOpen] = useState(false); myself.

Or if I write something in one file, it can reason that I am going to be using it in the next file, so it generates the code I would usually type. I still have to solve problems; it's just that I can do it quicker now.
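
For anyone who doesn't work with React, the kind of boilerplate being described looks roughly like this toy sketch (the component name and markup are made up purely for illustration):

```tsx
import { useState } from "react";

// Toy example: the open/close plumbing around a modal is exactly the kind of
// repetitive code an assistant can fill in from a one-line comment.
export function DeleteButtonWithConfirm({ onConfirm }: { onConfirm: () => void }) {
  // state for the confirmation modal
  const [isModalOpen, setIsModalOpen] = useState(false);

  return (
    <>
      <button onClick={() => setIsModalOpen(true)}>Delete</button>
      {isModalOpen && (
        <div role="dialog">
          <p>Are you sure?</p>
          <button onClick={() => { onConfirm(); setIsModalOpen(false); }}>Yes</button>
          <button onClick={() => setIsModalOpen(false)}>No</button>
        </div>
      )}
    </>
  );
}
```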

[-] cosmicrookie@lemmy.world 4 points 4 months ago

But this is OP's point. People are getting fired from tech companies because they don't need as many people anymore. Work is being done faster and cheaper by using AI.

[-] Hestia@lemmy.world 3 points 4 months ago

I've been messing around with running my own LLMs at home using LM Studio, and I've got to say it really helps me write code. I'm using Code Llama 13b, and it works pretty well as a programmer assistant. What I like about using a chatbot is that I go from writing code to reviewing it, and for some reason this keeps me incredibly mentally engaged. This tech has been wonderful for undoing some of my professional burnout.

If what keeps you mentally engaged does not include a bot, then I don't think you need any other reason to not use one. As much as I really like the tech, anyone that uses it is still going to need to know the language and enough about the libraries to fix the inevitable issues that come up. I can definitely see this tech getting better to the point of being unavoidable, though. You hear that Microsoft is planning on adding an AI button to their upcoming keyboards? Like that kind of unavoidable.

[-] yogthos@lemmy.ml 2 points 4 months ago

I'm not really losing any sleep over this myself. The current approach to machine learning is really no different from a Markov chain. The model doesn't have any understanding in a meaningful sense. It just knows that certain tokens tend to follow certain other tokens, and when you have a really big token space, that produces impressive-looking results.
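
To make the comparison concrete, here is a toy next-token generator (TypeScript, purely illustrative; a real LLM conditions on far more context and uses learned weights rather than raw counts, but the "which token follows which" framing is the same):

```typescript
// Toy next-token model: count which word follows which in a tiny corpus,
// then "generate" text by repeatedly emitting the most frequent successor.
function buildTable(corpus: string): Map<string, Map<string, number>> {
  const table = new Map<string, Map<string, number>>();
  const words = corpus.split(/\s+/).filter(Boolean);
  for (let i = 0; i < words.length - 1; i++) {
    const successors = table.get(words[i]) ?? new Map<string, number>();
    successors.set(words[i + 1], (successors.get(words[i + 1]) ?? 0) + 1);
    table.set(words[i], successors);
  }
  return table;
}

function generate(table: Map<string, Map<string, number>>, start: string, steps: number): string {
  const out = [start];
  let current = start;
  for (let i = 0; i < steps; i++) {
    const successors = table.get(current);
    if (!successors) break;
    // A real model samples from a probability distribution; here we just take the top count.
    current = [...successors.entries()].sort((a, b) => b[1] - a[1])[0][0];
    out.push(current);
  }
  return out.join(" ");
}

const table = buildTable("certain tokens tend to follow certain other tokens in a big token space");
console.log(generate(table, "certain", 4)); // "certain tokens tend to follow"
```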

However, a big part of the job is understanding what the actual business requirements are, translating those into logical steps, and then into code. That part of the job can't be replaced until we figure out AGI, and we're nowhere close to doing that right now.

I do think that the nature of the work will change; I kind of look at it as doing a sort of pair programming session. You can focus on what the logic is doing, and the model can focus on writing the boilerplate for you.

As this tech matures, I do expect that it will result in fewer workers being needed to do the same amount of work, and the nature of the job will likely shift towards something closer to a business analyst, where the human focuses more on the semantics than on implementation details.

We might also see new types of languages emerge that leverage the models. For example, I can see a language that allows you to declaratively write a specification for the code, and to encode constraints such as memory usage and runtime complexity. Then the model can bang its head against the spec until it produces code that passes it. If it can run through thousands of solutions in a few minutes, it's still going to be faster than a human coming up with one.
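
A toy version of that generate-and-check loop can already be sketched with executable constraints (TypeScript; every name below is hypothetical, and the "candidates" just stand in for model-generated attempts):

```typescript
// A "spec" here is just a set of executable constraints. Candidate
// implementations are hammered against it until one passes every constraint.
type Sort = (xs: number[]) => number[];

const spec = {
  // Functional constraint: agrees with a reference sort on a few inputs.
  sortsCorrectly: (f: Sort) =>
    [[3, 1, 2], [], [5, 5, 1]].every(
      (xs) => JSON.stringify(f([...xs])) === JSON.stringify([...xs].sort((a, b) => a - b))
    ),
  // Crude stand-in for a runtime-complexity constraint: a wall-clock budget.
  fastEnough: (f: Sort) => {
    const big = Array.from({ length: 50_000 }, () => Math.random());
    const started = Date.now();
    f(big);
    return Date.now() - started < 200; // milliseconds
  },
};

const passesSpec = (f: Sort) => spec.sortsCorrectly(f) && spec.fastEnough(f);

// Hypothetical candidates standing in for generated attempts.
const candidates: Sort[] = [
  (xs) => xs,                             // rejected: does not sort
  (xs) => [...xs].sort((a, b) => a - b),  // accepted
];

const winner = candidates.find(passesSpec);
console.log(winner ? "a candidate satisfied the spec" : "no candidate passed");
```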

this post was submitted on 05 Feb 2024
198 points (83.9% liked)
