💾 Archived View for jsreed5.org › log › 2024 › 202404 › 20240404-asynchronous-syncing.gmi captured on 2024-05-26 at 14:55:33. Gemini links have been rewritten to link to archived content

-=-=-=-=-=-=-

Asynchronous Syncing

2024-04-04

---

Recently I've been thinking about how I can reduce my computing and energy footprint. I'm trying all sorts of new workflows, from using low-power devices to removing GUIs from some of them. Where I feel comfortable doing so, I'm also removing Syncthing so that I don't have an always-on syncing program running.

I hope to not need Syncthing at all one day. It would be nice to have a store-and-forward or other asynchronous tool, similar to git or NNCP, that can look at the differences between two directories and resolve their divergences. That way I can keep only one or two small servers as file hosts, or even as dumb relays that simply connect machines to each other like croc.

The problem is that I have a lot of files I sync, and some of them change rapidly. I share my password database, my to-do list, several collections of notes, my camera photos, and even backups of video game save files. Not all directories are shared with every device, and not every device peers with every other device. The system is rather complicated, in all honesty.

I see several tools that work similarly to what I'd like to have. git comes close, for example: it requires no central server or relay, as long as the two nodes can connect to each other directly. It can bring diverging directories in line automatically, has a built-in conflict resolution mechanism, and it can even update asynchronously using `git bundle`. However, git keeps a full history of all the files in the directory; I just want to know which files need updating. It would also be nice if git had the ability to pull data from multiple up-to-date sources at once.
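As a rough sketch of that asynchronous mode (all paths here are hypothetical throwaways): `git bundle` packs refs into a single file that can travel by any offline route, and the other machine can clone or pull from it as if it were a remote.

```shell
# Hypothetical throwaway repo standing in for the machine with new commits.
git init -q /tmp/a
cd /tmp/a
echo "new note" > file
git add file
git -c user.name=demo -c user.email=demo@example.org commit -qm "update"
# Pack HEAD and every ref into one file that can travel by sneakernet,
# email, NNCP, or any other store-and-forward route:
git bundle create /tmp/updates.bundle HEAD --all
# On the other machine, treat the bundle like a read-only remote:
git clone -q /tmp/updates.bundle /tmp/b
# (or `git pull /tmp/updates.bundle` inside an existing clone)
```

The trade-off stands, though: the bundle carries history, not just the files that changed.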

rsync is another tool close to what I want. Where git's commit history makes it easy to see which files need updating, rsync instead builds incremental file lists, exchanged in real time between the two endpoints, to decide what to sync. rsync would actually come closest if it could write that incremental list to a file or data stream for the remote node to read later. That would provide the asynchronous aspect I'm looking for. But I don't know of any way rsync can do it without additional scripting.

Other tools begin to diverge from my goals. BitTorrent and croc can use relays, and BitTorrent can even send or receive files from multiple peers at the same time. But neither tool can handle file updates--BitTorrent is specifically designed to preserve files statically. I know Resilio Sync can handle file updates while being based on BitTorrent, but that's a proprietary tool, and naturally I prefer FOSS programs.

There are a few other tools I still need to look into, like Unison. But a cursory glance at most of them tells me they're not exactly what I'm looking for.

Ideally, I'd like to have a workflow like the following:

It sounds like a lot, but many store-and-forward tools offer scheduling to automate tasks like this, if not outright transfer tooling of their own, such as NNCP's call command. I'd especially like the listening daemon to be decoupled from the tool that performs the syncing, as it is in NNCP and is not in tools like Syncthing or Resilio Sync.
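To make that decoupling concrete, here's a rough sketch using NNCP's commands (the node name "peer" is hypothetical, and ports and routing depend on your nncp.hjson configuration):

```shell
# On an always-on relay: only the dumb listener runs here. It stores and
# forwards encrypted packets and knows nothing about what is being synced.
nncp-daemon -bind "[::]:5400"

# On a laptop, whenever convenient:
nncp-file ~/notes/todo.txt peer:   # queue a file for the peer
nncp-call peer                     # exchange queued packets with the peer
nncp-toss                          # unpack whatever arrived for this node
```

The syncing logic lives entirely in the short-lived commands; the daemon is just plumbing, which is exactly the separation I'm after.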

If anyone knows of any tools that work like this, I would be very interested to hear about them.

---


[Last updated: 2024-04-04]