this post was submitted on 23 Apr 2025
34 points (97.2% liked)

Programming

Some thoughts/predictions about how open source developers will be forced to choose their path with GenAI.

Full disclaimer: my own post, sharing for discussion and to find out if anyone has any brilliant ideas what else could be done. It looks like self-posts are okay here but let me know if I'm wrong about that.

top 10 comments
[–] onlinepersona@programming.dev 2 points 18 hours ago* (last edited 18 hours ago)

As you said, it's out of the box/bag. The thing I'll push for is open sourcing all code. Being able to copy opensource code and hide it in proprietary code is to me the biggest problem. Were everything opensource, I doubt anybody would bat an eye. "You copied my code and put it out there publicly, free of charge? Good. Do it again".

Personally, I license everything as restrictively as possible for companies, and would love an enforceable open-source license that figures out how to make companies contribute back or pay for use of the code.
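As a sketch of how a custom, non-standard license like this is typically declared per file, assuming the SPDX header convention (the identifier and names below are illustrative, not the commenter's actual setup):

```python
# SPDX-License-Identifier: LicenseRef-Anti-Commercial-AI
# SPDX-FileCopyrightText: 2025 Example Author <author@example.org>
#
# "LicenseRef-" is the SPDX mechanism for referring to a custom license
# that is not on the SPDX license list. Under the REUSE convention, the
# full license text would live in the repository at
# LICENSES/LicenseRef-Anti-Commercial-AI.txt.
```

This makes the licensing machine-readable, though whether any license can actually be enforced against model training is exactly the open question in the thread.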

Anti Commercial-AI license

[–] rimu@piefed.social 3 points 20 hours ago

Increasing amounts of code running on my computer and in the online services I use will be written by generative AI.

Emphasis added by me.

Thing is, it's not black and white most of the time - usually a developer is using GenAI as an assistant in some capacity. There's a wide range of ways to do that, with really big differences in how firmly their hand remains on the wheel. Only in the most extreme "vibe coding" scenario would it be fair to characterize the code as "written by AI".

There comes a point somewhere on the spectrum of dependence on AI where quality suffers and developer capacity-building is stunted. Where that point lies is a more productive question than a binary yes or no to all AI.

[–] lily33@lemm.ee 19 points 1 day ago* (last edited 1 day ago) (1 children)

with a long tail of grumpy holdouts who adhere to free software principles

Nothing in the core free software principles - namely, the four freedoms - actually concerns the development process and tools used - or copyright. It's all about what you can do with the software.

The GPL is more of a "hack" that "perverts" copyright to enforce free software principles - because that was the tool available, not because the people who wrote it really liked intellectual property.

[–] thomask@lemmy.sdf.org 4 points 1 day ago (1 children)

This is a good point. I assumed here that FS advocates would be more opposed than permissive licensers to a technology that serves to incorporate their code into software that doesn't provide the fundamental freedoms to end users. But yes, you could imagine an FS advocate who is quite happy to use the tech themselves and churn out code with the GPL attached.

[–] lily33@lemm.ee 3 points 1 day ago* (last edited 1 day ago)

The fact is, currently, AI can't write good code. I'm sure that at some point in the future it will - but we're not there yet, and probably have some years still.

Imagine at some point in the future, where an AI can program any piece of software you want for you, and do it well. At that point, the value of code itself will be minimal. If you keep your code proprietary, I'll just get the AI to re-implement the functionality anew and publish it.

Therefore, all code would be permissively open source. There would be no point in keeping anything proprietary, and the copyleft "hack" would simply be unnecessary - permissive open source would be just as good.

Until then, me not using AI doesn't in any way prevent others from training AI on my code. So I just don't see training on my code as a valid reason to avoid it. I don't use AI currently - but that's for entirely pragmatic reasons: I'm not yet happy with the code it generates.

[–] EnsignWashout@startrek.website 16 points 1 day ago (2 children)

The answer is 2.

Cling to known humans who write their own code.

Snake oil salesmen always encourage the public to bet against the experts, with predictable results.

Someday ethically sourced AI can be used responsibly by trustworthy coders.

But the key is choosing to collaborate with trustworthy coders.

[–] expr@programming.dev 5 points 1 day ago (1 children)

Yep. It does increasingly feel like developers like me, who find this deeply disturbing and problematic for our profession and society, are going to become rarer. Fewer and fewer people are going to understand how anything actually works.

[–] limer@lemmy.dbzer0.com 3 points 1 day ago

I think nobody understands exactly how anything works, but enough of us understand our own little corner of tech to make new things and keep the older things going. I've been coding for decades, and proudly state I understand about 1% of what I do. That's higher than most.

AI will make these little gardens of knowledge smaller for most, and yet again we, as the human species, will forever rely on another layer of tech.

[–] thomask@lemmy.sdf.org 1 points 1 day ago

Do you think there's a way for this to scale to larger projects like Servo? Or will it only work for a few people collaborating?

[–] Reptorian@programming.dev 2 points 1 day ago

I just write in a language few people really know well. It's not like I expect AI to do a great job of training on my code.