this post was submitted on 18 Feb 2024
30 points (76.8% liked)


For some background, I originally wanted to break into programming back when I was in college, but I drifted into desktop tech support and now systems administration. SysAdmin work is draining me, though, and I want to pick programming back up and see if I can make a career out of it. But the industry seems like it could be moving toward relying on AI for coding. Everything I've heard says AI is not there yet, but if it looks like AI will reach the point of fully automating coding, should I even bother? Am I going to be obsolete after a year? Five years?

[–] agressivelyPassive@feddit.de 0 points 8 months ago (9 children)

The real question here is: how much "coding work" is there left to do?

Currently, the bottleneck is available developers (barring short-term problems). Even if AI made every developer 30% more efficient, there would still be work to do. But there will be a point where this tips over. At some point, there's no additional demand anymore. We just don't know when this will happen. 50%? 100%? Maybe 700%?

One thing to keep in mind is that AI code doesn't have to be good, just good enough. Many nerds seem to think that efficiency, beauty, or elegance have value. But to a business, that's just a collateral benefit. Software can be buggy, slow, and hard to update. None of that matters if the results and costs are in a good-enough ratio.

[–] MagicShel@programming.dev 25 points 8 months ago (7 children)

How much coding work is left to be done? Infinity. There will always be more needed. Always. And while there is a certain truth to the idea that software just needs to be good enough, such software very quickly becomes nearly impossible to maintain or add new features to.

AI doesn't make us 30% more efficient. There are certain tasks it's really helpful for, but they are quite limited. I can see junior developers being replaced by AI while they're still at the stage where it takes more work to train them than to just do their job for them. Beyond that, a good developer has skills and experience that AI will never be able to replace, especially since the code has to be maintained.

[–] agressivelyPassive@feddit.de -4 points 8 months ago (6 children)

How much coding work is left to be done? Infinity.

Well, no. That's just plain wrong. There is only a certain amount of demand for software, as for every other product or service. That's literally Economics 101.

AI doesn't make us 30% more efficient.

You don't know that. Think about how much time you spend on boilerplate. Not only the "traditional" boilerplate, but maintenance, regular updates, breaking dependency upgrades, documentation.
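To make the "traditional boilerplate" concrete, here is a minimal Python sketch (the `Point` example is mine, not from the thread): the hand-written ceremony on top is exactly the kind of repetitive code that a tool, whether an IDE generator or an AI assistant, can write for you.

```python
from dataclasses import dataclass

# Hand-rolled boilerplate: __init__, __eq__, and __repr__ all
# written out by hand, and all purely mechanical.
class PointManual:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __eq__(self, other):
        return (isinstance(other, PointManual)
                and (self.x, self.y) == (other.x, other.y))

    def __repr__(self):
        return f"PointManual(x={self.x}, y={self.y})"

# The same class with the boilerplate generated automatically.
@dataclass
class Point:
    x: int
    y: int
```

Whether the generation is done by `@dataclass`, an IDE, or an AI, the developer time saved is the same kind of saving.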

Think about search. Google isn't that good at actually understanding what you want to find; an AI might surface that one obscure blog post from 5 years ago, and in 10 seconds, not 10 hours.

Think about all the tests you write that are super obvious: testing HTTP 500 handling, etc.
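A sketch of what such a "super obvious" test looks like, with a hypothetical response handler invented purely to make it concrete:

```python
# Hypothetical handler under test, not from the thread: returns the
# body on success, or None on any 5xx server error.
def handle_response(status_code, body):
    if 500 <= status_code < 600:
        return None  # let the caller retry or fall back to a cache
    return body

# The kind of test that is trivial to write but still has to exist.
def test_server_error_is_swallowed():
    assert handle_response(500, "Internal Server Error") is None

def test_success_passes_body_through():
    assert handle_response(200, '{"ok": true}') == '{"ok": true}'

test_server_error_is_swallowed()
test_success_passes_body_through()
```

Tests like these are near-mechanical to produce from the handler's contract, which is why they are a plausible target for generation.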

A technology doesn't have to replace you to make you more efficient, just taking some work off your shoulders can boost productivity.

[–] VoterFrog@kbin.social 13 points 8 months ago (1 children)

One thing that is somewhat unique about software engineering is that a large part of it is dedicated to making itself more efficient, and always has been. Programming languages, protocols, frameworks, services: all of it has made programmers thousands of times more efficient than the people who used to punch holes into cards to program the computer.

Nothing has infinite demand, clearly, but the question is more whether or not we're anywhere near the peak, such that more efficiency will result in an overall decrease in employment. So far, the answer has been no. The industry has only grown as it's become more efficient.

I still think the answer is no. There's far more of our lives and the way people do business that can be automated as the cost of doing so is reduced. I don't think we're close to any kind of maximum saturation of tech.

[–] agressivelyPassive@feddit.de -4 points 8 months ago (1 children)

Here again, I think, is a somewhat tech-centric view on economics.

There is only a finite amount of automation demand, simply because human labor exists.

Inside of our tech bubble, automation simply means more "functionality" per person per time unit. What took 10 devs a year yesterday can be done by 5 people in 6 months today. That's all fine and dandy, but at some point software clashes with the hard reality of physics. Software doesn't produce anything; it's often just an enabler for physical production. Lube, or grease.

Now, that production obviously can be automated tremendously as well, but with diminishing returns. Each generation of automation is harder than the one before. And each generation has to compete with a guy in Vietnam/Kenya/Mexico. And each generation also has to compete with its own costs.

Why do you think chips are so incredibly expensive lately? R&D costs are going through the roof, production equipment is getting harder and harder to build, and due to the time pressure, you have to squeeze as much money as possible out of your equipment. So prices go up. But that can't go on forever; at some point the customers can't justify/afford the expense. So there's a kind of feedback loop.

[–] VoterFrog@kbin.social 7 points 8 months ago

Yes, what I'm saying is that lower costs for software, which AI will help with, will make software more competitive against human production labor. The standard assumption is that if software companies can reduce the cost of producing software, they'll start firing programmers. But the entire history of software engineering has shown that's not true, as long as the lower cost opens up new economic opportunities for software users, thus increasing demand.

That pattern stops only when there are no economic opportunities to be unlocked. The only way I think that happens is when automation has become so prevalent that further advancement has minimal impact. I don't think we're there yet. Labor costs are still huge and automation is still relatively primitive.
