I really don't understand the idea of a C/C++-specific package manager. C is the baseline API for most systems. Your system package manager is already installing C/C++ libraries and headers. Most platforms even have standard paths for storing C/C++ headers and libraries.
Also, as a C/C++ guy, I could care less if the libraries are written in Brainfuck, for what it's worth, as long as their API stays usable from C, which is pretty much guaranteed. So the idea of a "C/C++ package manager" seems strange, and even like trying to create a barrier where there was none.
Sure there is a barrier and it is the same as for other programming languages.
If you rely on your package manager, you are basically stuck with whatever was in the previous Ubuntu LTS when it was shipped.
If you want the newest library, then you are either dependent on some external proper packager, or you have to get into packaging yourself. I don't consider picking a random external source something I would do in a professional context.
But if we open up the choice of library versions to "non-standard" versions, the problem becomes a moving one: team A in location B needs at least version X but not version Y, while team C in location D won't work with version Z.
To give that kind of flexibility, people went for building from source instead (Go, Java, ...).
And in every shop I have worked in until now, "your package manager" usually meant having to support at least one Linux distribution in two versions, plus Windows and Mac, which do not have a native package manager.
> If you rely on your package manager, you are basically stuck with whatever was in the previous LTS of ubuntu when it was shipped.
This is not exactly true. You are free to put together your company's PPA and package your stuff as you see fit. You know, use the system's package manager to serve your and your users' interests.
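Roughly, the workflow is small enough to sketch (the package name, distro codename, and repo path below are made up): you build the .deb once and publish it to an internal apt repository that every machine points at.

```
# Build a binary package from a source tree that already has a debian/ directory
cd libfoo-1.4.2
dpkg-buildpackage -us -uc -b

# Publish it to an internal apt repository (reprepro is one common choice)
reprepro -b /srv/apt includedeb focal ../libfoo-dev_1.4.2-1_amd64.deb
```

Clients add that repository to their sources.list and install the pinned version like any other package.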
> If you want the newest library, then you are either dependent on some external proper packager, or you have to get into packaging yourself. I don't consider picking a random external source something I would do in a professional context.
If you're in a "professional context" then you're already in the business of packaging and distributing software packages.
> You are free to put together your company's PPA, and just package your stuff as you see fit.
I think I've addressed that point and its downsides.
> If you're in a "professional context" then you're already in the business of packaging and distributing software packages.
Nowadays, you often aren't. A lot of companies are switching to the software-as-a-service model, and packaging and distribution are two of the major problems it solves.
"Most platforms" unfortunately excludes Windows, which is too popular to ignore.
Among the available platforms, Windows is by far the least picky about libraries and project setup. You pretty much only need to dump all interface headers in any part of your file system, dump your libraries somewhere else, and then point your project files to them. Afterwards you just need to ensure you bundle your DLLs alongside the executable, and you're all set.
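If it helps, the whole dance fits in a few lines of CMake; the paths and names below are made up, but this is the general shape:

```
add_executable(app main.cpp)

# Headers dumped anywhere on the file system
target_include_directories(app PRIVATE "C:/thirdparty/foo/include")

# Import library dumped somewhere else
target_link_libraries(app PRIVATE "C:/thirdparty/foo/lib/foo.lib")

# Bundle the DLL next to the executable after every build
add_custom_command(TARGET app POST_BUILD
    COMMAND ${CMAKE_COMMAND} -E copy_if_different
            "C:/thirdparty/foo/bin/foo.dll"
            $<TARGET_FILE_DIR:app>)
```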
Sure, but you get that sort of "freedom" on any other platform too. Unlike UNIX-y operating systems, Windows has no notion of a standard location for headers and libraries which package managers and compiler toolchains all agree on.
This kind of ignores the realities of some big projects, e.g. managing dependencies automatically in Qt (and in a licence-conforming way) is an absolute pain.
I don't want to try every single combination of static/shared etc. to find what works, that and deploying the executable, again, should be automated.
QMake is old but functional, CMake is more up to date but a total pain if you use it irregularly. I've never used Conan so I can't comment on it.
It's slightly ridiculous that this is still a pain point when Rust has cargo and D has dub - both of them not perfect, but totally functional and basically painless for the most part.
> QMake is old but functional, CMake is more up to date but a total pain if you use it irregularly.
In my experience it's the exact opposite. CMake can be a real treat to use, provided you follow the so-called modern CMake approach, and qmake is just an unmaintainable mess of a system.
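For anyone wondering what "modern CMake" means in practice: roughly, targets and usage requirements instead of global variables. A minimal sketch (the fmt dependency is just an example):

```
cmake_minimum_required(VERSION 3.15)
project(demo CXX)

find_package(fmt REQUIRED)                    # provides the imported target fmt::fmt

add_library(mylib src/mylib.cpp)
target_include_directories(mylib PUBLIC include)
target_compile_features(mylib PUBLIC cxx_std_17)
target_link_libraries(mylib PUBLIC fmt::fmt)  # usage requirements propagate to consumers

add_executable(app src/main.cpp)
target_link_libraries(app PRIVATE mylib)
```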
> I've never used Conan so I can't comment on it.
Conan is pretty much orthogonal to any build system. Conan's responsibility is pretty much limited to managing the project's third-party libraries: it handles downloading their interfaces and libraries, passes their settings, and integrates them into your project with as little code as possible.
There are indeed some modules for CMake that offer limited functionality for integrating third-party packages, such as the ExternalProject module. However, those were an afterthought that only adds complexity to a project.
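To make that concrete, a minimal sketch of the Conan side (the library names and versions are just examples, and the generator shown is one of several):

```
# conanfile.txt
[requires]
fmt/7.1.2
zlib/1.2.11

[generators]
cmake_find_package
```

After running "conan install . --install-folder build", the generated Find modules let the project's CMake code stay ordinary:

```
list(APPEND CMAKE_MODULE_PATH ${CMAKE_BINARY_DIR})
find_package(fmt REQUIRED)
target_link_libraries(app PRIVATE fmt::fmt)
```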
There are two use cases for package managers.
One is as an end-user to install software, another as a developer to have a set of specific libraries to use.
For the latter, the version used varies per project. The former is typically system wide.
For my hobby projects, I indeed use cmake to generate .deb packages which I then use in other projects.
For professional development I've never seen this work successfully.
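For what it's worth, the .deb generation mentioned above is basically CPack; a minimal sketch (all the metadata values are made up):

```
install(TARGETS mylib LIBRARY DESTINATION lib)
install(DIRECTORY include/ DESTINATION include)

set(CPACK_GENERATOR "DEB")
set(CPACK_PACKAGE_NAME "libmylib-dev")
set(CPACK_PACKAGE_VERSION "1.4.2")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "you@example.com")  # the DEB generator requires a maintainer
include(CPack)
```

Running cpack (or the "package" target) in the build directory then spits out the .deb.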
Your system package manager typically also sucks for keeping multiple versions/variants of libraries around, and library developers aren't particularly interested in providing packages for many different systems. Not all platforms even have a system package manager you can easily add your own packages to.
I think Nix does a pretty good job at this. I've used it on Ubuntu, macOS and NixOS (with the latter two sharing the same system config, with OS-specific pieces tucked into if/else expressions).
I think of Nix more as a build system than a package manager (and in fact I've come to think of "package managers" as a sub-set of build systems, mostly used to work around deficiencies in Make which make it unsuitable for building whole operating systems via composition of their Makefiles).
One thing that Nix emphasises is that users should not be installing dependencies, like libraries; let alone in a globally visible, ambiguous location like '/usr/lib' (or /usr/bin, etc. if it's a tool that we're depending on). Dependencies are only needed 'ambiently' in a package's build environment; build products can embed references to those by absolute path if they like, but they shouldn't be 'propagated' into the final system.
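A rough sketch of what that looks like in practice (a per-project shell.nix; the dependencies listed are just examples):

```
{ pkgs ? import <nixpkgs> {} }:

# Everything the build needs is listed here; nothing gets installed
# into /usr/lib or any other globally visible location.
pkgs.mkShell {
  buildInputs = [
    pkgs.cmake
    pkgs.ninja
    pkgs.fmt
    pkgs.zlib
  ];
}
```

nix-shell then drops you into an environment where exactly those store paths are visible to the compiler, and two projects can pin different versions side by side without touching the system.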
> Your system package manager typically also sucks for keeping multiple versions/variants of libraries around
They really don't. At most, what really sucks is the way some projects are bundled. Unfortunately, the way Qt has been packaged in Debian/Ubuntu is one of these examples of how not to package software. Qt ships and maintains multiple release versions concurrently, and it's a shame that package managers insist on pretending they only need to have one semver major version installed at all times.
Linux package managers have no problem with multiple versions of a library:
https://unix.stackexchange.com/a/209712
Linux package managers (that I'm aware of) tend to dislike keeping multiple versions of development headers around, though; they have no issues keeping multiple versions of the libraries themselves. That seems relevant in a discussion about a development library.
I understand that nix might be a notable exception here.
> Linux package managers (that I'm aware of) tend to dislike keeping multiple versions of development headers around, though;
They really don't. At most, those doing the packaging lack the insight to allow users to install separate versions of the same package. You only need to ensure your package does not reuse another release's paths, and allow users to install a generic major/minor version of a package, and that's it.
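As a hypothetical layout, nothing stops two -dev packages from coexisting if each release keeps to its own paths:

```
libfoo1.2-dev:
  /usr/include/foo-1.2/foo.h
  /usr/lib/x86_64-linux-gnu/libfoo.so.1.2
  /usr/lib/x86_64-linux-gnu/pkgconfig/foo-1.2.pc

libfoo2.0-dev:
  /usr/include/foo-2.0/foo.h
  /usr/lib/x86_64-linux-gnu/libfoo.so.2.0
  /usr/lib/x86_64-linux-gnu/pkgconfig/foo-2.0.pc
```

A project then picks a release with pkg-config --cflags --libs foo-1.2 (or foo-2.0), the same way gtk+-2.0 and gtk+-3.0 have coexisted for years.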
Isn't Docker a thing mainly due to the hassle that it is to manage libraries and headers in C/C++?
That is not C/C++ specific; from what I saw, most "dynamic" languages picked it up rather more enthusiastically. It resolves the conflict between different library dependencies that arises in almost all languages, and with Docker I can solve it in a uniform way. If you install something into the system, it will mess it up for others. Some languages do have "native" ways to avoid that, e.g. Python has virtualenv, Ruby has Bundler, Java has classpaths?... but I can still mess up the system. And each of them solves it in a different way.
But with Docker, I package and isolate the whole OS. So, as a consumer, I do not have to know how to do that for each language, and as a producer, I can do (almost) whatever I want within the container without fear of messing up other containers or the host OS.
Yes, this whole approach has its own downsides. But those things are a plus, I would say.
Honestly that all seems to mostly derive from C/C++ issues, doesn't it? Managing Python and Ruby is pretty easy until you hit a library with OS dependencies.
But I'm there with you, I love Docker. But it is a solution to problems we should not be having after 30-ish years.
> Managing Python and Ruby is pretty easy until you hit a library with OS dependencies.
And that's why there is practically only one solution to solve it in either language (/sarcasm off).
Docker, in my mind, wasn't a sensible choice for compiled languages prior to multi-stage builds, or better, BuildKit.
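A minimal multi-stage Dockerfile sketch of what changed (image tags and paths are just examples): the toolchain lives only in the first stage, and the runtime image ships just the binary.

```
FROM debian:buster AS build
RUN apt-get update && apt-get install -y g++ cmake ninja-build
COPY . /src
RUN cmake -S /src -B /build -G Ninja && cmake --build /build

FROM debian:buster-slim
COPY --from=build /build/app /usr/local/bin/app
ENTRYPOINT ["/usr/local/bin/app"]
```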
> Honestly that all seems to mostly derive from C/C++ issues, doesn't it?
It derives from having binary libraries, and a plethora of ABIs. And contrary to Go advocates and Docker fans, I do believe they have their place. There is no perfect technical solution; there is always a trade-off.
With compiled languages you deliver binaries. A development / delivery split is expected.
The intermediate data (object files + debug data) often needs a lot of storage and is not supposed to be shipped directly. With a binary interface (ABI), you can simply ship your shared object or executable individually.
With statically linked/compiled binaries, you have to ship every binary that uses the fixed library. Or, in the case of Docker, it will likely affect a lower layer, which invalidates all the upper layers of every Docker image using said library.
So a couple of bytes of delta in a couple-of-KB library can mean a couple-of-KB binary package, or a couple of multi-MB binaries or layers containing all the upper parts.
Your OS is most likely working with shared objects/DLLs and their ABIs, and there is a reason for that. Your use-case might not demand that, and that is okay, but it doesn't invalidate that there is a sensible reason for doing it the way it is done.
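A toy example of that delta (file names made up, error handling omitted):

```
# Shared: the app only references libfoo through its ABI
gcc -fPIC -shared libfoo.c -o libfoo.so
gcc main.c ./libfoo.so -o app_dynamic

# Static: libfoo's code is baked into every consumer
gcc -c libfoo.c -o libfoo.o && ar rcs libfoo.a libfoo.o
gcc main.c libfoo.a -o app_static

# After a one-line fix in libfoo.c:
#   shared case -> rebuild and re-ship only libfoo.so (a few KB)
#   static case -> rebuild and re-ship app_static and every other binary
#                  (or Docker layer) that embedded libfoo
```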
> Honestly that all seems to mostly derive from C/C++ issues, doesn't it?
No, it doesn't. It derives exclusively from the fact that there is no package manager. Python has a few package managers. Node.js also has package managers, and Rust has one too. In C or C++, the only package manager there is, is a Linux distro's own generic package manager, which isn't exactly designed to handle independent projects.
There is no coincidence in the fact that Node.js's npm and Conan are services offered by private for-profit companies which have a free tier and a paid tier targeting enterprise customers. It's no coincidence because packaging and distribution is a generic software engineering need that is not tied to a specific programming language, and it's a value-add that, done right, is worth paying for.
The value of Docker is not packaging per se. It's like a Linux installer: yeah, it's great that it comes bundled in a neat ISO image that you can burn/copy/run in place, but that's not its value proposition. Its value proposition is that you have self-contained software packages that can be executed at will, free from side effects.
Couldn't care less. Why do people keep doing this?
This seems a bit strange. Conan's client is open source, and the repositories are decentralised, but the repository server it all depends on, JFrog Artifactory, is proprietary.
They have an open source Conan server [1], and Artifactory has a community edition which is open source [2].
1:
https://docs.conan.io/en/latest/uploading_packages/running_y...
2:
https://jfrog.com/open-source/
It's the same with npm for Node.js, there is a company behind the package manager.
oh yes!
CMake and qmake are not enough. Adding Ninja...
> CMake and qmake are not enough. Adding Ninja...
Your comment makes no sense at all. Both CMake and qmake are competing build systems, and qmake has already lost and been superseded by CMake.
CMake works as a high-level makefile generator, which pretty much outputs the DAG of a project's build targets in any of the supported lower-level systems. Right now it supports Make and also Ninja.
CMake's support for Ninja is a nice-to-have, just like its support for Visual Studio and Xcode project generation. No CMake user was ever blocked by CMake's lack of support for Ninja. In fact, hardly anyone notices any change if you replace Make with Ninja. I was surprised to learn that Qt Creator started configuring CMake projects to use Ninja by default, because things just worked the same anyway, and honestly I didn't bother to change anything because there was no discernible difference at all.
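Which is the point: the generator is a configure-time switch, and the targets you get are the same either way.

```
cmake -S . -B build-make                 # default generator (Unix Makefiles on Linux)
cmake -S . -B build-ninja -G Ninja       # same project, Ninja backend
cmake --build build-ninja                # drives ninja under the hood
```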
Still not enough... adding Conan and QML...
How about you switch to a free licence such as a two-clause BSD or an MIT licence?
If you pay them enough ... they seem to struggle and are searching for ways to get revenues.
Alternatively, they have to go bankrupt; then the agreement with KDE kicks in and it becomes BSD
https://kde.org/community/whatiskde/kdefreeqtfoundation/
(but in that case one has to ensure somebody continues working on it ... no idea if KDAB and KDE have enough manpower ...)
With Electron (MIT licence) the writing has been on the wall for Qt for years now. They are on borrowed time.
I am not saying I like Electron better. I prefer Qt, even though I don't consider it to be native either. But Qt would not have died had they relicensed _earlier_. It's evident they will have to do it eventually. Why wait until there's no more lunch to be eaten?
"They" (Qt Company) are a commercial enterprise that mostly makes its money from license sales. Relicensing as MIT would be terrible for them. From their product policy you can clearly tell that even LGPL isn't working as they would like, but they can't get rid of that, so they focus in fields that don't like LGPL either. That's of course annoying for those of us who don't optimize for "what makes them money", but for "how can we build software best"...
How do you know it would have died? Mind that their business is in embedded stuff, an area where Electron is no competition (putting in a bit more hardware and embedding Android might be competition, but Qt runs on quite small systems)
https://www.qt.io/product/develop-software-microcontrollers-...
As an outsider, it seems like the "Desktop" side is a marketing vehicle but not their bet for the business (see also the push for QML over QWidgets: less integration with the desktop and less "native" looks, but working on cheaper hardware)
> Electron (MIT licence)
It's not 100% MIT. Chromium's Blink renderer has LGPLv2.1 code going back to the KDE KHTML days, such as some of the code for the DOM:
https://github.com/chromium/chromium/blob/master/third_party...
Very rough way of going about it, but searching for files with copyright headers crediting KDE shows 832 files
https://github.com/chromium/chromium/search?l=C%2B%2B&q=kde....