👽 defunct

I used to store all my projects on GitHub. when I stopped that, I created a bare repo in my Dropbox. now, with minimal hardware, there is no way to connect to Dropbox. how do you back up your code well then? without complicated network storage and stuff?

3 years ago

15 Replies

👽 defunct

either that, or use a small pi with bare repos that I can ssh into... thanks for all your suggestions! I think they are giving me enough inspiration to find a solution · 3 years ago

👽 defunct

i came across someone else who had the same issue, and he wrote a little script that tars the .git folder and pushes it through curl to a specific Google Drive folder, replacing the current tar there. on download he extracts the tar and resets HEAD. this backfires when you develop code in multiple places, but I find it quite smart to push a tar onto Dropbox and let them do the versioning. · 3 years ago
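The script itself isn't posted in the thread; a minimal sketch of the same idea, with the curl upload to the cloud folder replaced by a plain local directory (`$REMOTE`, an assumption here) so the round trip works offline, might look like:

```shell
# Sketch of the "tar the .git folder" backup described above.
# In the real script the remote would be a curl upload to Google Drive /
# Dropbox; a local directory ($REMOTE) stands in for it here.
REMOTE="${REMOTE:-$HOME/git-backups}"

backup_repo() {  # backup_repo <repo-dir>
    mkdir -p "$REMOTE"
    # Tar only the .git folder, overwriting the previous tar as described.
    tar -C "$1" -cf "$REMOTE/$(basename "$1").git.tar" .git
}

restore_repo() {  # restore_repo <repo-dir>
    mkdir -p "$1"
    tar -C "$1" -xf "$REMOTE/$(basename "$1").git.tar"
    # Rebuild the working tree from the restored history ("resets the head").
    git -C "$1" reset --hard -q
}
```

The multiple-machines caveat applies here too: whoever uploads last silently overwrites the other machine's history.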

👽 sdfgeoff

I should also add that Gitea is very painless to set up/use. A single executable for a GitHub-like interface... · 3 years ago

👽 sdfgeoff

Any git repo is a 'host' that you can push to. This means you can, for example:

- push to a network drive mounted with ftp/sftp/samba

- push to another folder on your PC

- push to a folder on a USB drive

- push to a folder on your PC that happens to be in Dropbox

etc.

I suggest setting up ssh access to a Raspberry Pi, and using ssh to push to it. It's simple and robust. · 3 years ago
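The folder-as-remote idea above can be sketched like this (paths and the remote name "backup" are examples, not from the thread):

```shell
# Sketch: use any plain directory (USB stick, Dropbox folder, network mount)
# as a git remote. Paths and the remote name "backup" are examples.
backup_to_dir() {  # backup_to_dir <work-repo> <backup-dir>
    git init -q --bare "$2"                                # create the bare "remote"
    git -C "$1" remote add backup "$2" 2>/dev/null || true # idempotent
    git -C "$1" push -q backup "$(git -C "$1" symbolic-ref --short HEAD)"
}

# e.g. backup_to_dir ~/code/myproject /mnt/usb/myproject.git
# Over ssh to a Raspberry Pi it's the same idea:
#   git remote add pi pi@raspberrypi.local:repos/myproject.git && git push pi
```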

👽 lykso

@defunct I imagine the closest thing to what you had with Dropbox would be setting up a server running SSH with a bare repo and then pushing to/pulling from that repo over SSH. I'm having a hard time understanding what counts as "complicated network storage" in this scenario TBH. If there's something simpler than rsync, or scp, or git over SSH to a computer on your local network, I'm unaware of it. (Piping the output of tar through netcat, maybe? Only if you don't mind your data being unprotected over the wire and transferring redundant data, of course.) · 3 years ago
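The tar-over-netcat aside can be sketched as below. Netcat flag syntax differs between nc implementations, so the nc lines are indicative only, and a local pipe stands in for the network so the mechanics can be tried offline:

```shell
# Sketch of piping tar through netcat (unencrypted, full resend each time).
# nc flag syntax varies between implementations; indicative only:
#   receiver: nc -l 9000 | tar -xf -
#   sender:   tar -cf - myproject | nc receiver-host 9000
# Locally, the same tar pipe works without the network:
copy_tree() {  # copy_tree <src-dir> <dest-parent-dir>
    mkdir -p "$2"
    tar -C "$(dirname "$1")" -cf - "$(basename "$1")" | tar -C "$2" -xf -
}
```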

👽 defunct

@lykso I avoid http because this doesn't belong there, nor does it provide any benefit. yes, I eventually need to run it across a network, but ultimately I have a few backup storages, and I need to get the data in there and back out. over http I'd have dozens of services in the end, and none of them are about HTML; all are a frontend to manage a different format · 3 years ago

👽 lykso

Why HTTP in particular? You could run a job to push everything to a local machine over SSH pretty easily, right? Are you trying to avoid networks entirely? Because at that point, I think you're stuck attaching a USB drive and copying things over every so often. · 3 years ago

👽 defunct

@iam I want to avoid anything on http, especially a service 🙈 · 3 years ago

👽 iam

RocketGit instead of GitHub? I am really happy with it · 3 years ago

👽 defunct

good point, why not run a pi zero · 3 years ago

👽 marginalia

I used to run against a raspberry pi with a large-ish USB memory. That works too I guess. The thing with git is that it's a distributed system. Every checked out version has all the information, so if the server burns, you can reconstruct it from your workstation's version. · 3 years ago
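That recovery property is easy to demonstrate: any surviving checkout can repopulate a fresh bare repo (names here are examples, not from the thread):

```shell
# Sketch: rebuild a lost server-side repo from any surviving checkout.
rebuild_remote() {  # rebuild_remote <surviving-checkout> <new-bare-path>
    git init -q --bare "$2"
    # Every clone carries the full history; --all pushes every local branch.
    git -C "$1" push -q "$2" --all
}
```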

👽 marginalia

I have my repos on my workstation, and committed to my server (i.e. the one that's hosting marginalia.nu). The server also takes daily backup snapshots of ~git in case of a disk failure on the root disk. · 3 years ago
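The snapshot job isn't shown in the thread; a minimal sketch with assumed paths (`$GIT_HOME`, `$SNAP_DIR`) and an assumed cron schedule might be:

```shell
# Sketch of a daily dated snapshot of a git home directory.
# GIT_HOME and SNAP_DIR are assumed example paths; rotation/retention omitted.
GIT_HOME="${GIT_HOME:-/home/git}"
SNAP_DIR="${SNAP_DIR:-/var/backups/git}"

snapshot_git_home() {
    mkdir -p "$SNAP_DIR"
    tar -C "$(dirname "$GIT_HOME")" -czf \
        "$SNAP_DIR/git-$(date +%Y-%m-%d).tar.gz" "$(basename "$GIT_HOME")"
}

# assumed crontab entry for a daily run:
#   30 3 * * * . /usr/local/lib/snapshot.sh && snapshot_git_home
```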

👽 defunct

so essentially you're all using some form of network storage · 3 years ago

👽 isoraqathedh

I rely on a friend to host a gitea instance for me, but I also keep my own remotes in my home NAS and external hard drive. Designating a home computer and connecting via VPN is essentially what I did when I was international. · 3 years ago

👽 wim

I use "rsync -avrz ./ yourserver.com:$PWD/" aliased in my bashrc as backup, so it backs up the current directory. · 3 years ago
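As a .bashrc line that might look like the following (the server name is a placeholder, and -a already implies -r, so it is dropped here; the single quotes matter so $PWD expands when the alias runs, not when it is defined):

```shell
# In ~/.bashrc: back up the current directory to the same path on a server.
# "yourserver.com" is a placeholder. Single quotes defer $PWD expansion to
# the moment the alias is used rather than when it is defined.
alias backup='rsync -avz ./ yourserver.com:"$PWD"/'
```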