this post was submitted on 12 Aug 2023
778 points (96.4% liked)
Programmer Humor
you are viewing a single comment's thread
So was all this bloat inevitable as hardware got better, or is there a way to go back? It feels like a ripoff that our computers are 1000x better but they're maybe 10x faster once all the shitty software is taken into consideration.
Perhaps some bloat is kind of inevitable. For example, apps these days handle most languages just fine, including emoji and LTR/RTL text. Some have pretty decent accessibility support. They can render a pretty complicated interface on an 8K screen reasonably fast (often hardware-accelerated in some way). There is a ton of functionality baked in: your editor can render your HTML or Markdown side by side with the source code as you edit it. You have version control, a terminal emulator, language servers, etc.
But then there's Electron, which takes an engine capable of rendering anything and uses it to render a UI, so as a result there's not much optimization you can do. A button is actually a bunch of DOM elements wrapped in CSS, etc. It's just good enough for the "hardware is cheap" approach.
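To make that concrete, here is a hypothetical sketch of what a single styled button in a typical web-based UI toolkit expands to under the hood (the element names and structure are illustrative, not any specific framework's actual output). It models the DOM as plain objects so the pieces can be counted:

```typescript
// A native toolkit draws one button widget; a web-based UI builds a small
// DOM tree for it. Modeled as plain objects here to count the pieces.
type UiNode = { tag: string; cls?: string; children: UiNode[] };

const button: UiNode = {
  tag: "div", cls: "btn-wrapper", children: [
    { tag: "button", cls: "btn", children: [
      { tag: "span", cls: "btn-ripple", children: [] },
      { tag: "span", cls: "btn-label", children: [
        { tag: "span", children: [] }, // holder for the text node
      ]},
      { tag: "span", cls: "btn-focus-ring", children: [] },
    ]},
  ],
};

// Count every element in the tree.
const count = (n: UiNode): number =>
  1 + n.children.reduce((sum, c) => sum + count(c), 0);

console.log(count(button)); // 6 elements for one button
```

Each of those elements carries its own style resolution, layout, and paint cost, which is where the "not much optimization you can do" part comes from.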
I think Emacs is a good example to look at. It has a ton of built-in functionality, and with many plugins (either a custom configuration or something like Doom Emacs) you can have a very capable editor, comparable to the likes of VS Code. Decades back, Emacs had a reputation for being bloated because it used megabytes of RAM. These days it's even more "bloated" due to all the stuff that has been added since. But in absolute numbers it doesn't need nearly as much as its Electron-based peers. The difference can easily be an order of magnitude or more, depending on configuration.
I am working on an application using DevExpress XAF. It lets you build a big enterprise application relatively quickly by doing a lot of the dirty work you would otherwise do yourself for CRUD stuff. A lot of the application can be modified through mere clicks, without touching a single line of code.
It is cool, but kinda bloaty. When you simply launch an XAF application, it uses 300 megabytes of RAM. And that's before you've even loaded a single byte of business data. You have just reached the login screen.
At least I felt it was "kinda bloaty" until I first booted Void Linux on my gaming PC at home and took a look at htop. IT'S ONLY 400 MEGABYTES AND IT'S READY TO USE! MAYBE ADD 200 MEGABYTES FOR KDE!
ALL THIS BLOATING CANNOT CONTINUE! WE HAVE TO TAKE ACTION IMMEDIATELY OR WE WILL BE FOREVER DOOMED TO UPGRADE OUR RAM IF WE WANT EVEN A FUNCTIONING TEXT EDITOR!
I have a few suggestions:
You know, I haven't worked on a super big project, but I feel like every time I've gotten a type error in a static language it's pointed to something wrong with my underlying reasoning.
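A minimal sketch of that experience, assuming a TypeScript-style checker with strict null checks (the `User`/`nameOf` names are made up for illustration): the type error isn't pedantry, it points at a real gap in the reasoning — the lookup can miss.

```typescript
// The happy-path code forgot that Array.prototype.find can return
// undefined; the checker refuses to compile until the miss is handled.
interface User { id: number; name: string }

function nameOf(users: User[], id: number): string {
  const found = users.find(u => u.id === id);
  // return found.name;  // compile error: 'found' is possibly 'undefined'
  return found ? found.name : "unknown"; // the fix the type error forced
}

console.log(nameOf([{ id: 1, name: "Ada" }], 2)); // "unknown"
```

The flagged line is exactly the flawed assumption ("the user is always there"), which matches the feeling that the type error pointed at the underlying reasoning rather than at syntax.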
Nope. The bloat is there mainly because it makes the job easier for the devs.
In the short run, yes. In the long run, this just produces a bunch of coders who are afraid of type declarations, because they were scared away from them with the "what if you have to choose?" tagline, making it harder to turn back to the proper way of doing things.
Can you talk more about this? I've never heard that tagline and can't figure out what it's supposed to mean.
Just from context, I'm guessing it means that you might type things one way and then need to use them another way later, and dynamically typed languages are sold as not having that problem.
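If that guess is right, a hedged sketch of the failure mode (the function and data are invented for illustration): code written against one shape of data keeps "working" in a dynamic language after the shape changes, and only blows up at runtime, while a static checker flags the mismatch at compile time.

```typescript
// Sums prices in cents. Written when prices were numbers.
function totalCents(prices: number[]): number {
  return prices.reduce((sum, p) => sum + Math.round(p * 100), 0);
}

// Later the data source starts returning strings instead:
const fromApi = ["3.50", "1.25"];
// totalCents(fromApi); // compile error: string[] is not assignable
//                      // to number[] — in plain JS this would silently
//                      // produce NaN at runtime instead.

console.log(totalCents([3.5, 1.25])); // 475
```

So "what if you have to choose?" is really just deferring the choice: the types were always there, the question is only whether the mismatch surfaces before or after shipping.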
I was thinking about this a bit. Does that mean you can develop a piece of software much more cheaply now? I have a fear that companies writing software get a 10% discount from writing bloat, while clients wind up using 10,000% the resources and are just so used to it they don't complain.
It's not really inevitable; it's just a consequence of developers being able to get away with being lazy because the hardware can cope with it.