this post was submitted on 23 Mar 2025
299 points (97.8% liked)
Technology
It would be interesting to have encrypted blobs scattered around volunteer computers/servers, like a storage version of BOINC / @HOME.
People tend to have dramatically less spare storage space than spare compute time, though, and it would need to be very redundant to guarantee no data loss.
Oh for sure, that's quite reasonable, though at some point you just move towards re-creating BitTorrent, which is effectively what you want anyway.
You could build an appliance on top of the protocol that enables the distributed storage; that might actually be pretty reasonable 🤔
Of course you'd need your own protocols to break the data up into manageable, consistently chunked parts, and to make content capable of being removed from the network (or at least made inaccessible) for DMCA claims. That's the kind of thing that keeps the Internet Archive from being too big a target for government entities.
Yeah, some kind of fork of the torrent protocol where you can advertise "I have X amount of space to donate" and there's a mechanism to hand you the most endangered bytes on the network, maybe. It would need to be a lot more granular than torrents, to account for the vast majority of nodes not wanting, or not being able, to get to "100%".
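The "most endangered bytes" allocation could be as simple as sorting chunks by how many known replicas they have. A rough sketch (all names here are made up for illustration, not part of any real protocol):

```python
# Hypothetical sketch: given a map of chunk ID -> number of known
# replicas, hand a new volunteer the rarest chunks that fit in the
# space they offered.

CHUNK_SIZE = 256 * 1024**2  # assume fixed 256 MiB chunks


def assign_chunks(replica_counts: dict[str, int], donated_bytes: int) -> list[str]:
    """Pick chunk IDs with the fewest replicas first, up to the space budget."""
    budget = donated_bytes // CHUNK_SIZE
    endangered = sorted(replica_counts, key=lambda cid: replica_counts[cid])
    return endangered[:budget]


# A volunteer donating 1 GiB has room for four 256 MiB chunks,
# so they get the four rarest ones:
counts = {"a": 5, "b": 1, "c": 3, "d": 2, "e": 9}
print(assign_chunks(counts, 1024**3))  # ['b', 'd', 'c', 'a']
```

A real tracker would also need to update the counts as nodes come and go, but the prioritization itself stays this simple.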
I don't think the technical aspects are insurmountable, and there's at least some measure of a builtin audience in that a lot of people run archiveteam warrior containers/VMs. But storage is just so many orders of magnitude more expensive than letting a little cpu/bandwidth limited process run in the background. I don't know that enough people would be willing/able to donate enough to make it viable?
~70,000 data hoarders volunteering 1 TB each to be a 1:1 backup of the current archive.org isn't a small number of people, and that's only a single parity copy. But it also isn't an outrageously large number of people.
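A quick back-of-the-envelope check on those figures (pure arithmetic from the numbers in the comment; the 3x replication factor below is just an illustrative assumption):

```python
# The comment's implied archive size: 70,000 volunteers x 1 TB = 70 PB.
archive_pb = 70
per_volunteer_tb = 1

# Volunteers needed for one full parity copy (1 PB = 1000 TB):
volunteers_per_copy = archive_pb * 1000 // per_volunteer_tb
print(volunteers_per_copy)  # 70000

# Meaningful redundancy needs several copies; at a hypothetical
# 3x replication you'd need:
print(volunteers_per_copy * 3)  # 210000
```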
You might not have to fork BitTorrent at all: instead, have your own protocol for grouping and breaking the data into manageable chunks of a particular size, where each chunk is an actual, full torrent. Then you don't have to worry about completion levels on those torrents, and you can rely on the protocol to do its thing.
Instead of trying to modify the protocol, modify the process you use the protocol with.
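The chunking layer described above could sit entirely outside BitTorrent. A minimal sketch, assuming fixed-size chunks named by their own hash so every node derives the same IDs independently (real torrent creation would use an existing library; nothing here is part of the actual BitTorrent protocol):

```python
# Hedged sketch: split an archive into fixed-size chunks, each of
# which would become its own ordinary, standalone torrent.
import hashlib

CHUNK_SIZE = 256 * 1024**2  # 256 MiB per "mini-torrent"


def chunk_stream(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Yield (chunk_id, payload) pairs.

    chunk_id is the SHA-256 of the payload, so any node chunking the
    same data gets the same IDs without coordination.
    """
    for offset in range(0, len(data), chunk_size):
        payload = data[offset:offset + chunk_size]
        yield hashlib.sha256(payload).hexdigest(), payload
```

Each ID could then name a standalone torrent; a node either seeds a chunk completely or not at all, so per-torrent completion tracking disappears and the swarm logic stays stock BitTorrent.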