Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or GitHub post here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues with the community? Report them using the report flag.
Questions? DM the mods!
Keep in mind this could drive up your energy bill.
Shouldn't be that bad. My Raspberry Pi 4B can run Jellyfin and Nextcloud without pushing 15 W at full load.
x86 is inefficient, especially older models, but you'll likely only push anything over 10 W when actually streaming something that requires transcoding. Most of the time your home server is going to sit idle or run some tiny cron job that won't really tax the CPU at all.
It totally doesn't
I'm running a 14th gen i9 with a 4080. It's a power-hungry boy. 1500 W power supply, generally drawing about 600-800 W.
Running this 24/7 costs me <$10/month in electricity.
The old Compaq Presario with a Pentium II that probably pulled down 100 W, running Ubuntu Server as described here, made no statistically significant change in my electric bill. That is to say, it's about as much change as being good or bad at turning off your lights when you're not using them. It's negligible.
At 600 watts running continuously, wouldn't that be 432 kWh a month?
Assuming you didn't mean you were running your gaming computer as the server.
At 100 watts that comes out to 72 kWh. In CT, where I live, rates are waaaay higher than what I calculate your rate to be (around 2.5 cents per kWh).
For me, a 100 watt server is about ~$22 a month to run.
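The arithmetic in this exchange is easy to check. A minimal sketch (the 600 W and 100 W draws and the 2.5-cent and 30-cent rates are the figures being debated in the thread, not universal values; plug in your own):

```python
def monthly_kwh(watts: float, hours: float = 720) -> float:
    """Energy used over a 30-day month (720 hours) at a constant load."""
    return watts * hours / 1000


def monthly_cost(watts: float, rate_per_kwh: float) -> float:
    """Monthly electricity cost in dollars for a constant load."""
    return monthly_kwh(watts) * rate_per_kwh


# 600 W continuously is 432 kWh/month; at the ~2.5 cents/kWh implied by
# the "<$10/month" claim, that's roughly $10.80.
print(monthly_kwh(600), monthly_cost(600, 0.025))

# 100 W continuously is 72 kWh/month; at a ~30 cents/kWh CT rate,
# that's roughly the $22/month figure quoted above.
print(monthly_kwh(100), monthly_cost(100, 0.30))
```

This is also why the back-calculated 2.5 cents/kWh looks suspicious: dividing a sub-$10 bill by 432 kWh only works out if the machine really idles far below its rated draw.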
Are you sure you're paying 2.5 cents per kWh?
Idk what I'm paying per kWh; I'm just going off my monthly bills.
There are other power fluctuations, I'm sure. I pay it no mind; I just look at the bill. 🤷♂️
So far no bill has arrived that made me change behavior.
Edit: I've also never measured what my machine actually pulls continuously or when idle. I just know that its components demand that range, and that I need the headroom in my power supply for spikes.
That's great if it works for you. However, a lot of us don't want the bigger power bill. It also has the problem of heating everything up.
I like CPUs with lower TDPs
No "old i7," as I suggested, is going to meaningfully increase the temperature of your room if it has any cooling solution in place.
Your stubbornness around a perfectly practical solution is absurd. I won't bother convincing you further; it's the obvious cost-effective solution.
The problem is it isn't cost-effective once you factor in electricity. You can pick up a CPU that is more efficient.
I'm not saying you're wrong, but what you're describing is not great for some people, including myself.
You're not wrong, but there are also trade-offs.
It is still by far the most cost-effective option.
Your argument amounts to nothing.