
Package management

I've probably ranted about this before but can't remember. Or maybe I made a draft post and deleted it. Regardless, this is going to be about package management and how it's handled by developers.

Namely, I've seen a rising trend in the following:

- unwieldy install scripts piped from curl into a shell

- developers "owning" package specs

- developers only catering to specific platforms, like Ubuntu

- general complaints about Linux package management

Linux package management may not be perfect, but I find the complaints stem from a lack of understanding rather than valid criticism. So what exactly is the goal of package management on Linux (roughly)?

- provide a cohesive set of packages that are known to work together

- quality assurance: have users test packages before they're deployed to stable repositories

- apply distro-specific patches and hardening (like stack protection)

- distro-specific preferences, like removing proprietary codecs or installing files to the right places
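
To make those goals concrete, here's a minimal sketch of what a distro package recipe looks like, using an RPM spec since I mention those later. The package name, patch, and URL are made up for illustration; the macros are the standard RPM ones, and on Fedora/RHEL the %configure macro injects the distro's hardened compiler flags automatically.

```
# Hypothetical spec for illustration; the name, version and patch are made up.
Name:           examplelib
Version:        1.2.3
Release:        1%{?dist}
Summary:        Example library packaged by the distro
License:        MIT
URL:            https://example.com/examplelib
Source0:        %{url}/releases/examplelib-%{version}.tar.gz
# Distro-specific fix carried by the packager, no upstream involvement needed.
Patch0:         examplelib-fix-paths.patch

BuildRequires:  gcc, make

%description
An example of how little a spec needs to say.

%prep
%autosetup -p1

%build
# %%configure pulls in the distro's hardening flags (stack protector, PIE, ...).
%configure
%make_build

%install
%make_install

%files
%license LICENSE
%{_libdir}/libexample.so.*
```

Each goal above maps onto a line or two here: Patch0 carries the distro fix, %configure applies the hardening, and %files puts things where the distro expects them.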

At some point, developers (read: those who don't use Linux) complain that there are way too many distros to support. This is obviously the wrong way to think about it. If you develop software that is portable enough, distro packagers can package it asynchronously, without any developer intervention required. Upstream communication, patches, and issue reporting are a bonus, but not required. The idea that developers have to support every distro out there is a bit of a strawman, and it's the cause of some bad practices that are prevalent today.

Let's take a practical example. Oftentimes developers will build a game for one platform (like an LTS release of Ubuntu) and either statically link the libraries or ship shared libraries alongside the game, updating them rarely if ever. There are multiple problems here (a quick check is sketched after the list):

- it only works for one platform/configuration, the one the developer tested on

- these libraries will likely be incompatible with the system the user is running on

- they may be outdated, contain known security vulnerabilities, or lack the hardening the distro applies
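
You can see all three problems on a real system with one standard command. The game's directory layout and library names below are hypothetical; ldd and LD_LIBRARY_PATH are standard, and ldd honours LD_LIBRARY_PATH, so this shows exactly what the launcher script would load (output trimmed):

```
# Hypothetical game layout; names are made up for illustration.
$ ls ./game/lib
libSDL2-2.0.so.0  libcrypto.so.1.0.0  libssl.so.1.0.0

# What the binary resolves against when launched the way the
# vendor's script launches it:
$ LD_LIBRARY_PATH=./game/lib ldd ./game/bin/game | grep -E 'ssl|crypto'
        libssl.so.1.0.0 => ./game/lib/libssl.so.1.0.0
        libcrypto.so.1.0.0 => ./game/lib/libcrypto.so.1.0.0
```

That libssl.so.1.0.0 is the OpenSSL 1.0.x series, which stopped receiving upstream support at the end of 2019; a bundled copy like this will never see a distro security update.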

A particularly egregious example is OpenSSL, which is one of the most important libraries on your system. You want to be running the one packaged by your distro, not random copies floating around that may be insecure and never receive security updates.
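
Checking what your distro actually ships, by contrast, takes seconds. These are standard package-manager queries; the exact package name varies by distro and release, so treat the names below as examples:

```
# Fedora/RHEL: installed version plus the distro's patch history
# (backported CVE fixes show up in the changelog).
$ rpm -q openssl
$ rpm -q --changelog openssl | head

# Debian/Ubuntu: the library package is libssl3 on current releases,
# libssl1.1 on older ones.
$ dpkg -s libssl3 | grep -E '^(Version|Source)'

# What the system's own openssl binary reports:
$ openssl version
```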

As to why installers piped from curl into a shell are a bad idea, that's already been beaten to death. The aspect I'm more concerned about is the idea that they make things easier or save developers time, when I find the opposite. These scripts usually end up being hundreds or thousands of lines long as they try to accommodate every system possible, which not only makes them hard to audit but also introduces the potential for bugs. This is usually hand-waved away with "convenience" and "you can read the script if you want to". Of course, that ignores the fact that not only is this a terrible way to install software, it's also a horribly inefficient one. Unlike an RPM spec file, which is usually short and easy to skim, these shell scripts are hard to read, sometimes automatically generated by other tools, and it's hard to guess what they will do to your system without running them. Bonus points if the install script is just a wrapper around other scripts and you have to hunt those down too.
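
For what it's worth, even when a vendor only offers a script, nothing stops you from splitting the pipe so you can actually read the thing first. The URL and checksum file below are hypothetical; the commands are standard:

```
# The pattern being criticized: run unread code straight off the network.
curl -fsSL https://example.com/install.sh | sh

# The minimal improvement: download, verify (if the vendor publishes a
# checksum), read, and only then run.
curl -fsSLO https://example.com/install.sh
curl -fsSLO https://example.com/install.sh.sha256   # hypothetical checksum file
sha256sum -c install.sh.sha256
less install.sh
sh ./install.sh
```

It still doesn't give you the review, dependency tracking, or clean uninstall a distro package does, but at least you know what ran.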

The moral of the story is, if developers want an easier time with Linux, perhaps they should actually try to learn it instead of being stubborn and chasing these horribly inefficient practices.