Why I dislike containers and per-program packages
One "recent" trend is to more more and more packages into stand-alone containers.
The popular ones include Docker, Snap and Flatpak, but I also want to mention things like npm, pip and others.
From a security and administration perspective, I see them as a huge risk. Let me explain.
What is a container
Basically, it is a package where the service or program includes all the files it needs to run. Depending on the actual container format, it comes with more or less of its dependencies.
Dependencies are things like shared libraries, which are no longer shared, because a bundled copy comes with each container.
I count things like pip, npm and similar as containers here, because they basically do the same thing: each package ships its own copy of its dependencies.
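To see the bundling in practice, here is a small shell sketch (using the public nginx image as an example; any image works) that lists the OpenSSL copies an image ships with, regardless of what the host has installed:

    # create a stopped container from the image, without running it
    docker create --name inspect-me nginx:latest
    # export its filesystem as a tar stream and list bundled OpenSSL copies
    docker export inspect-me | tar -tf - | grep -i libssl
    # clean up
    docker rm inspect-me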
What are containers good for
In my humble, grumpy opinion: to let developers/maintainers be lazy.
But in the more open definition: you can update your application and its dependencies without changing the ones used by the operating system and other apps. But this is exactly the bad thing, see "Why I dislike them" below.
It is good if you want to upgrade a fast-moving application like a web browser, say, so it does not get slowed down by the sometimes really slow update cycles of the system libraries.
Why I dislike them
On a Linux system, I do an OS update through the package manager and I expect that every program now uses the new, fixed version of a library. Let's say there is a security issue in an often-used library like OpenSSL. I run the update, get the fixed version, restart all applications that use OpenSSL, and I'm done. Well, almost: only as long as I don't use Docker or similar. If I do use them, I need to update each and every one of those containers. That takes a lot of time, and sometimes there simply is no update, because the image was never rebuilt with the fixed library. That way I have a perfectly patched and secure base system, but the container-wrapped application X is still unfixed and can be used to compromise the system (or at least its environment).
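To make the difference concrete, here is a shell sketch (Debian/Ubuntu commands; the OpenSSL package name, libssl3 here, varies by release and distribution):

    # base system: one upgrade fixes every dynamically linked consumer
    apt-get update && apt-get install --only-upgrade libssl3
    # find processes still holding the old, now-deleted library in memory
    grep -l 'libssl.*(deleted)' /proc/[0-9]*/maps 2>/dev/null
    # containers: each image needs its own rebuilt version; pulling only
    # helps where upstream actually published that rebuild
    docker images --format '{{.Repository}}:{{.Tag}}' | xargs -rn1 docker pull

One command patches every dynamically linked program at once; the container loop only helps where a rebuilt image actually exists upstream.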
Even worse are npm/pip/<insert shadow package manager of your language here>. By using not the system libraries but shadow versions with insane dependency trees, most of them never see an update.
Even my development environments are a royal pain, you know where, to update. No, a simple npm update does not work. Even worse: I add a new dependency to my JavaScript app, the newest available version pulls in a dependency that is already marked as deprecated, and now my setup reports "3 high severity vulnerabilities" with no way to get rid of them.
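To be fair, npm at least ships tooling to surface the problem, even if, as in my case, it often cannot actually resolve it:

    # list known vulnerabilities in the installed dependency tree
    npm audit
    # apply only fixes that stay within the declared semver ranges
    npm audit fix
    # last resort: allow breaking major-version upgrades (can break the build)
    npm audit fix --force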
But yeah, this eases development, because the dependencies and (no longer shared) libraries and so on never change underneath you. And I, as the developer, don't have to test against updated versions.
The result is what was previously called DLL hell in the Windows world.
It also wastes storage space on multiple versions, sometimes of full Linux distributions (Docker), as well as RAM, because I end up with three different versions of the same shared library active at once. And yes, for me this matters: I do server work where every GB of storage costs money and more RAM is far more expensive. A lot of what I run lives on cloud VMs with 1 GB of RAM.
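You can actually watch this duplication happen. A hedged one-liner (run as root so all processes are visible) that lists the distinct copies of libssl currently mapped into memory, where each container base image can contribute its own:

    # distinct (device, inode, path) triples distinguish real file copies,
    # even when different containers use the same path string
    grep -h 'libssl' /proc/[0-9]*/maps 2>/dev/null | awk '{print $4, $5, $6}' | sort -u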
Sorry, no thanks.
I try to avoid them as much as possible. If an install manual tells me to run "pip install" to create a shadow environment, I still get the application running, but install as much as I can via the system package manager. Yes, sometimes this means patching the source to make it work with newer/different versions of libraries. But that way I'm also better protected against security issues.
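A sketch of the workflow (Debian/Ubuntu shown; package names are distribution-specific), using the Python requests library as a stand-in example:

    # instead of pulling a shadow copy with: pip install requests
    # install the distribution-maintained package, so the OS security
    # updates cover it like every other system library
    apt-get install python3-requests
    # Fedora/RHEL equivalent: dnf install python3-requests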
So my servers are Docker-, Snap- and Flatpak-free. I only use Docker for compiling stuff for different distributions (like the ones my servers run).