BeerW0lf | 2021-02-12
There are many ways to transfer files between hosts on a network. If you don't have a proper network, you might need to resort to "sneakernet". For those unfamiliar with the term, you slap your data onto a physical device, be it a floppy, CD-R, USB stick or a tape, and use your feet to carry it between hosts. When your hosts do have access to a proper network, there are more sophisticated methods.
The methods below should work as long as the remote host has an SSH server installed. Nc is the exception; it only requires that you have some kind of access to both hosts.
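Before trying any of them, it doesn't hurt to check that SSH access actually works. A minimal sketch, assuming key-based login is set up; the username and hostname are just placeholders:

# Should print "ok" if you can log in non-interactively
# BatchMode=yes makes ssh fail instead of prompting for a password
ssh -o BatchMode=yes username@remote-host.net 'echo ok'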
Scp works just like cp, only between hosts over SSH.
# From local host to remote host
scp /local_path/file username@remote-host.net:/remote_path/

# From remote host to local host
scp username@remote-host.net:/remote_path/file /local_path/
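A couple of variations I find handy. The port number 2222 is just an example; use whatever your remote sshd listens on:

# Copy a whole directory recursively
scp -r /local_path/dir username@remote-host.net:/remote_path/

# Preserve modification times and modes, and connect to a nonstandard port
scp -p -P 2222 /local_path/file username@remote-host.net:/remote_path/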
Rsync is great if you need to make recurring backups. Only the changes are copied. The following will sync the entire directory recursively. You may want to drop the -z option if you have a fast connection.
rsync -e ssh -azAX /local_path/to/copy username@remote-host.net:/remote_path/to/save/

# Options are:
# -A : Preserve ACLs.
# -X : Preserve extended attributes/SELinux contexts.
# -a : Archive mode.
# -z : Compress file data during the transfer.
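If you also want rsync to delete files on the remote side that no longer exist locally, a dry run first is cheap insurance. A sketch using the same example paths as above:

# Show what would be transferred or deleted without changing anything
# -v : verbose, -n : dry run, --delete : remove extraneous files on the receiver
rsync -e ssh -azAXvn --delete /local_path/to/copy username@remote-host.net:/remote_path/to/save/

# Drop the -n once the output looks right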
Sftp can transfer files, but it can also be used to navigate the remote file system and make modifications if needed. If you're on a desktop, you can use a graphical client. I've mostly used Filezilla, but there are many to choose from.
If you're on a terminal, fear not, it's pretty easy to use there as well.
sftp username@remote-host.net

# After login you'll be on the remote file system, where you can:

# Download a file to your local directory, where you started the client
sftp> get /path/to/file/on/remote_server/foo

# Download to a specific directory
sftp> get /path/to/file/on/remote_server/foo /local_directory/

# Download a whole directory recursively
sftp> get -r /remote_server/directory /local_directory/

# Upload a file from the local host
sftp> put /local_directory/file /remote_server/

# Upload a local directory recursively
sftp> put -r /local_host/dir /remote_directory/

# You can also navigate and modify the remote file system
sftp> cd /remote_directory
sftp> rm /remote_dir/file
sftp> rmdir /remote_dir
sftp> mkdir /remote/dir
sftp> rename file_name new_file_name
sftp> chmod 664 remote_dir/or/file
# Note: depending on your sftp version, chown/chgrp may only accept numeric IDs
sftp> chown username /remote/file/or/dir
sftp> chgrp groupname /remote/file/or/dir
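The local side can be driven from the same prompt too, and sftp can run scripted sessions with -b. A quick sketch; commands.txt is just a placeholder for your own batch file:

# Inside an sftp session:
sftp> lpwd                 # print the local working directory
sftp> lcd /local_directory # change the local working directory
sftp> lls                  # list local files
sftp> !df -h               # a leading ! runs any local shell command

# Non-interactive use: run the commands listed in a file, then exit
sftp -b commands.txt username@remote-host.net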
When you have a fast network, and secure connections only slow your CPU down, or you're just too lazy to set up user accounts for a one-time transfer, you'll want to use netcat. Just be aware that this transfer is NOT ENCRYPTED and should not be used outside local networks.
The following will send the whole directory recursively. The pv pipe is optional, but it's very nice to have a progress counter for large transfers.
# local host
tar -cpf - /local_path/to/send | nc -N remote-host.net 44444

# remote host
nc -vl 44444 | pv | tar -xvf -
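For a single file you can skip tar entirely, and since nc gives you no integrity guarantees, comparing checksums afterwards is cheap insurance. The filenames and port here are just placeholders:

# remote host: listen and write whatever arrives to a file
nc -vl 44444 > received_file

# local host: send the file, then print its checksum
nc -N remote-host.net 44444 < local_file
sha256sum local_file

# remote host: the checksum should match the one printed on the local host
sha256sum received_file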