💾 Archived View for gmn.clttr.info › en › hosting.gmi captured on 2023-05-24 at 17:41:44. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
Although it served me well, I don't use this setup anymore. I switched to a mix of a hosted VPS and a Raspberry Pi, for no specific reason other than that I wanted to try some things on the Raspberry Pi. I'm keeping this description online for the time being in case anyone is interested.
My gemini capsule - together with some other services - runs on a server at my home. This page describes the setup and why I've chosen it.
The "server" sits in a small basement room of my house. There's no need for air conditioning as we have a rather stable 16 to 18 °C down there.
Connection to the outside world is provided by an asymmetric DSL line with 100 MBit/s downstream and 30 MBit/s upstream.
The machine itself is:
I prefer the KISS principle, so I use a very simple partitioning scheme with ext4/xfs. No LVM, btrfs or other fancy tools - their benefits are not relevant for my use case. In case of a failure it is very important for me to be able to access the data without complicated tooling. The failure itself is enough stress; no need to add more with tools I'm not familiar with.
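A plain setup along these lines might look like the following fstab fragment (the UUIDs, mount points and filesystem split are hypothetical, not my actual layout):

```
# Hypothetical /etc/fstab: plain partitions with ext4/xfs,
# no LVM or btrfs layers in between.
UUID=xxxx-root  /     ext4  defaults          0 1
UUID=xxxx-data  /srv  xfs   defaults,noatime  0 2
```

Any of the disks can be mounted read-only on another Linux box with nothing but the stock kernel drivers.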
orrg - online feed renderer for gemini
Database choice
When running services, you eventually reach the point where you need to persist data, and therefore need some kind of database.
For the services mentioned above I've opted for SQLite, which is serverless and lightweight. I think this database is a great fit for small or medium traffic services with mostly read access. It does not require any additional server software and can be backed up like any other file.
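One caveat when backing up "like any other file": copying the database while the service is writing to it can catch it mid-transaction. The sqlite3 command line tool's .backup command takes a consistent snapshot even of a live database. A minimal sketch (the function name and paths are made up for illustration):

```shell
# Take a consistent snapshot of a (possibly live) SQLite database.
# $1 = source database, $2 = snapshot destination.
snapshot_sqlite() {
    sqlite3 "$1" ".backup '$2'"
}

# Example: snapshot_sqlite /srv/app/data.db /mnt/backup/data.db
```

The snapshot file is itself a normal SQLite database and can be opened directly on any other machine.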
There are obviously some shortcomings to this concept. Write-intensive use cases with heavy concurrent writes are not a good fit. You also can't have advanced features like point-in-time restore or a high availability setup.
But all of these features are meaningless for my small private hosting. For this use case, the additional complexity of running a full DBMS is a much bigger burden than the restrictions of SQLite.
Everyone who has ever suffered data loss (and a few who haven't) will be aware that a valid backup strategy is essential.
My backup strategy is split into two parts:
I have 3 hard disks which hold a rotating backup that is created once a week. The disks rest in a fireproof safe in my house.
The data is rsync'ed to the disks. This yields full backups reaching back 3 weeks, which can be accessed directly by attaching a disk to an arbitrary system - no special tooling needed.
The "cloud" backups are created automatically every night: a full backup every month and incremental backups in between. These backups are compressed and encrypted.
I've opted for duplicity, a long-standing, proven backup solution which suits my use case.
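A nightly schedule like the one described could look roughly like this crontab entry (the paths, target URL and GPG key ID are placeholders, not my real configuration):

```
# Hypothetical crontab entry: nightly encrypted duplicity run at 03:00.
# --full-if-older-than 1M starts a fresh full backup chain every month;
# runs in between are incremental. Output is compressed and GPG-encrypted.
0 3 * * * duplicity --encrypt-key ABCD1234 --full-if-older-than 1M /srv b2://bucket/server-backup
```

duplicity handles the full/incremental chaining and the encryption itself, so the remote storage only ever sees opaque encrypted volumes.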