this post was submitted on 12 Nov 2023
24 points (92.9% liked)

Selfhosted

I have about 8 TB of storage that is currently only replicated through a RAID array. I occasionally sync that to an external USB drive and leave it in a fireproof safe (same location).

I'd really like to do an offsite backup, but I only have 10 Mbps upload. We are literally talking months to do a full backup.
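
Back-of-envelope (assuming the link stays saturated and ignoring protocol overhead), that works out to roughly two and a half months:

```python
# Back-of-envelope: how long a full 8 TB upload takes at 10 Mbps.
# Assumes the link stays saturated and ignores protocol/encoding overhead.
data_bits = 8 * 10**12 * 8       # 8 TB (decimal) expressed in bits
upload_bps = 10 * 10**6          # 10 Mbps upload
days = data_bits / upload_bps / 86_400
print(f"~{days:.0f} days (~{days / 30:.1f} months)")  # ~74 days, roughly 2.5 months
```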

How do others handle situations like this?

[–] tal@lemmy.today 6 points 10 months ago (4 children)

I don't do offsite backups, but if your backup system supports it, you could physically take your backup drive to some location with a lot of bandwidth and toss the initial full backup up from there.

[–] nix98@lemmy.world 3 points 10 months ago (2 children)

Yeah, that is what I am thinking. I am using duplicity for backups, so I can probably back up to a hard drive, take that to work, sync it to my backup provider, and then just do incremental backups from then on.

However, I think duplicity really wants to do a full backup every X months, so I'm not sure of the best way to handle that.
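
Roughly what I have in mind for the seeding part, as a sketch only; the paths, the --name label, the remote URL, and the rclone remote are all placeholders, not a tested recipe:

```python
"""Sketch: seed the full backup locally with duplicity, then go incremental."""
import subprocess

SOURCE = "/srv/data"                         # data to protect (placeholder)
SEED = "file:///mnt/seed-drive/dup"          # portable drive for the initial full
REMOTE = "b2://my-bucket/dup"                # backup provider URL (placeholder)

def run(*cmd):
    subprocess.run(cmd, check=True)

# 1. At home: write the initial full backup to the portable drive.
run("duplicity", "full", "--name", "offsite", SOURCE, SEED)

# 2. Somewhere with fast upload: copy the backup volumes to the provider
#    out-of-band, e.g. with rclone (remote/bucket names are placeholders).
run("rclone", "copy", "/mnt/seed-drive/dup", "provider:my-bucket/dup")

# 3. Back home: point duplicity at the provider; with the same --name it
#    should reuse its local signature cache and only upload incrementals.
run("duplicity", "incremental", "--name", "offsite", SOURCE, REMOTE)
```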

[–] tal@lemmy.today 2 points 10 months ago* (last edited 10 months ago)

I don't know whether periodic full backups matter for performance, but you would need one if you wanted to remove old backups. It looks like the term for a full backup that reuses data already pushed to the remote is a "synthetic full" backup, and duplicity can't do those: when it does a full, it pushes all the data over again.

I have never used it, but Borg Backup does appear to support this, if you want an alternative that does.
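
If you went the Borg route, the incremental-forever flow might look roughly like this (repo location, encryption mode, and retention numbers are placeholders, not a recommendation):

```python
"""Sketch of Borg's incremental-forever flow; all names are placeholders."""
import subprocess
from datetime import date

REPO = "ssh://user@backuphost/./backups/borg"   # needs borg installed remotely
SOURCE = "/srv/data"

def run(*cmd):
    subprocess.run(cmd, check=True)

# One-time: create the repository (encryption mode is up to you).
run("borg", "init", "--encryption=repokey", REPO)

# Each run: a new archive that only stores chunks not already in the repo,
# so there is no periodic "full" re-upload.
run("borg", "create", f"{REPO}::data-{date.today()}", SOURCE)

# Old archives can be dropped without re-uploading anything.
run("borg", "prune", "--keep-daily", "7", "--keep-weekly", "4",
    "--keep-monthly", "6", REPO)
```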

EDIT: However, Borg requires code running on the remote end, which may not be acceptable for some setups; you can't just aim it at a plain fileserver. duplicity can work against a plain fileserver.

I have also never used it, but from a quick glance, duplicati looks like it can work without code running on the remote end and can also do synthetic full backups.

[–] CmdrShepard@lemmy.one 1 points 10 months ago* (last edited 10 months ago) (1 children)

Another alternative is to set up a backup server at a willing friend's or family member's house, so that you can physically take the drive there and just upload any new changes later.

[–] RvTV95XBeo@sh.itjust.works 2 points 10 months ago

Or pick up two backup drives, keep one at a friend's or relative's house, and just swap them every time you visit.

I keep a drive at my parents' house in case of emergencies. Backup frequency is essentially every few months, but I also have a local portable drive with real-time sync that I can snag on my way out.
