I used to store all my projects on GitHub. When I stopped that, I created a bare repo in my Dropbox. Now, with minimal hardware, there is no way to connect to Dropbox. How do you back up your code well, then, without complicated network storage and stuff?
2 years ago
Either that, or use a small Pi with bare repos to SSH into... thanks for all your suggestions! I think they're giving me enough inspiration to find a solution · 2 years ago
I came across someone else who had this issue, and he wrote a little script that tars the .git folder and pushes it through curl to a specific Google Drive folder, replacing the current tar there. On download, he extracts the tar and resets the head. This backfires when you develop code in multiple places, but I find it quite smart to push a tar onto Dropbox and let them do the versioning. · 2 years ago
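A minimal sketch of that scheme, with rclone standing in for the original curl call (the post doesn't show the actual curl/Drive API invocation; the remote name "gdrive:" and the backups/ folder below are assumptions):

```
#!/bin/sh
# Sketch of the tar-and-upload idea. The original post pushes through curl
# against the Google Drive API; this version swaps in rclone, assuming a
# remote named "gdrive:" has already been set up with `rclone config`.
set -e
repo=$(basename "$PWD")

# Snapshot the .git folder and replace the previous tar in the drive folder;
# overwriting the same file lets the drive's own file versioning keep history.
tar czf "/tmp/$repo.git.tar.gz" .git
rclone copyto "/tmp/$repo.git.tar.gz" "gdrive:backups/$repo.git.tar.gz"
rm "/tmp/$repo.git.tar.gz"

# Restore (run in an empty directory):
#   rclone copyto "gdrive:backups/$repo.git.tar.gz" "$repo.git.tar.gz"
#   tar xzf "$repo.git.tar.gz" && git reset --hard
```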
I should also add that Gitea is very painless to set up/use. A single executable for a GitHub-like interface... · 2 years ago
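As a rough illustration of "single executable", assuming the appropriate binary has already been downloaded from Gitea's release page:

```
# Make the downloaded binary executable and start the bundled web server;
# by default the web UI listens on port 3000.
chmod +x gitea
./gitea web
```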
Any git repo is a 'host' that you can push to. This means you can, for example:
- push to a network drive mounted with ftp/sftp/samba
- push to another folder on your PC
- push to a folder on a USB drive
- push to a folder on your PC that happens to be in Dropbox
etc.
I suggest setting up SSH access to a Raspberry Pi and using SSH to push to it, as in the sketch below. It's simple and robust. · 2 years ago
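For illustration, both variants from this post, with made-up paths and hostnames (/mnt/usb, myproject, and raspberrypi.local are all placeholders):

```
# Option 1: a bare repo on a plain folder or USB drive as a remote
git init --bare /mnt/usb/myproject.git
git remote add usb /mnt/usb/myproject.git
git push usb main

# Option 2: a bare repo on a Raspberry Pi, reached over SSH
# (run `git init --bare ~/repos/myproject.git` on the Pi first)
git remote add pi pi@raspberrypi.local:repos/myproject.git
git push pi main
```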
@defunct I imagine the closest thing to what you had with Dropbox would be setting up a server running SSH with a bare repo and then pushing to/pulling from that repo over SSH. I'm having a hard time understanding what counts as "complicated network storage" in this scenario TBH. If there's something simpler than rsync, or scp, or git over SSH to a computer on your local network, I'm unaware of it. (Piping the output of tar through netcat, maybe? Only if you don't mind your data being unprotected over the wire and transferring redundant data, of course.) · 2 years ago
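The tar-through-netcat aside would look roughly like this; the host and port are made up, the listen flag differs between netcat variants, and as the post says, nothing here is encrypted:

```
# On the receiving machine: listen and unpack (BSD netcat shown;
# traditional netcat wants `nc -l -p 9000` instead)
nc -l 9000 | tar xz

# On the sending machine: stream a compressed copy of the repo across
tar cz .git | nc backup-host.local 9000
```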
@lykso I avoid HTTP because this doesn't belong there, nor does it provide any benefit. Yes, I eventually need to run it across a network, but ultimately I have a few backup storage locations, and I need to get the data in there and back out. Over HTTP I'd end up with dozens of services, none of which are about HTML; all of them are just frontends for managing a different format. · 2 years ago
Why HTTP in particular? You could run a job to push everything to a local machine over SSH pretty easily, right? Are you trying to avoid networks entirely? Because at that point, I think you're stuck attaching a USB drive and copying things over every so often. · 2 years ago
@iam I want to avoid anything over HTTP, especially a service 🙈 · 2 years ago
RocketGit instead of GitHub? I am really happy with it · 2 years ago
Good point, why not run a Pi Zero · 2 years ago
I used to run against a Raspberry Pi with a large-ish USB stick. That works too, I guess. The thing with git is that it's a distributed system. Every checked-out version has all the information, so if the server burns, you can reconstruct it from your workstation's copy. · 2 years ago
I have my repos on my workstation, and committed to my server (i.e. the one that's hosting marginalia.nu). The server also takes daily backup snapshots of ~git in case of a disk failure on the root disk. · 2 years ago
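A daily snapshot job of that sort could be as small as a single cron entry; the schedule, paths, and rotation below are illustrative guesses, not what that server actually runs:

```
# /etc/cron.d/git-backup (sketch): at 03:00 every night, archive ~git onto
# a second disk, rotating by weekday so seven dated snapshots are kept.
# (% must be escaped in crontabs, hence \%.)
0 3 * * * root tar czf /mnt/backup/git-$(date +\%a).tar.gz /home/git
```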
So essentially you're all using some form of network storage · 2 years ago
I rely on a friend to host a Gitea instance for me, but I also keep my own remotes on my home NAS and an external hard drive. Designating a home computer and connecting via VPN is essentially what I did when I was abroad. · 2 years ago
I use "rsync -avrz ./ yourserver.com:$PWD/;" aliased in bashrc as backup so it backsup the current directory. · 2 years ago