Last week we had a meeting of our local developers group, where various things were discussed (including a Summer of Code presentation by yours truly). One of the things that came up was the dislike some in the room have for Linux package management, the main complaint being that ‘you have to resolve a million dependencies when you install something’. I disputed this, but on reflection it really does depend on which distribution you’re using and how you manage it. The fact that our local IT scene is heavily dominated by Windows, with Linux mostly used in small pockets like telecoms, means that most people have had relatively little exposure to Linux, so it’s understandable that the general impression is that it looks hard to manage.
The ‘trouble’ is that Linux offers you so many options to choose from when it comes to version management, and updates to the packaged software arrive thick and fast. This is just par for the course in the open source community, where the mantra is ‘release early, release often’. Essentially the whole open source community is running on Agile development principles - and this can be disconcerting when you’re used to the much slower pace of updates on Windows (excluding security fixes). The thing is, and this is where the ‘dependency hell’ perception comes from, that most desktop distributions plug you fairly directly into that update stream, meaning that yes, you can often be bombarded with large dependency updates on a regular basis. I can see how that would put off some Windows server admins, given the risk implied by that number of major updates.
Desktop distributions are not designed for mission-critical environments. For those, you want a much more static configuration with, in the main, just security and bugfix updates - the emphasis being “if it ain’t broke, don’t fix it”. There are distributions that support that philosophy, and my personal favourite is Debian. The apt-get package manager is simply unrivalled, especially when combined with other helper tools - one that’s on every server I have is Wajig, which is very quick and command-line friendly. Aptitude is also useful for beginners since it’s menu-based while still not requiring X - many of my servers are remote and I like to be able to administer them all over a simple, fast SSH console; a GUI is just baggage I don’t need. There are GUIs for those who are surgically attached to their WIMP environments though.
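To give a feel for how little typing day-to-day maintenance takes over SSH, here is a rough sketch of the usual cycle with apt-get, and the Wajig equivalents (Wajig largely mirrors the apt commands; the package name `some-package` is just a placeholder):

```shell
# Refresh the package index from the configured mirrors
apt-get update

# Apply pending updates (on a 'stable' box this is mostly
# security and bugfix releases, so it's a short list)
apt-get upgrade

# Install something new; dependencies are resolved automatically
apt-get install some-package

# The same cycle via the Wajig wrapper
wajig update
wajig upgrade
wajig install some-package
```

That’s the whole routine for a stable server - no dependency wrangling involved, because the resolver does it for you.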
Dependency hell is really not an issue for me - you in fact have total control over what you upgrade and what you don’t, either by locking particular packages or, even simpler, locking yourself to a major version. Debian has always updated more slowly than other distros anyway (barring security updates), putting the emphasis on stability, which is why I like it for servers. You always have the option of picking between major branches - ‘stable’, ‘testing’, or ‘unstable’ - depending on what balance of new features versus stability you want. And when a new stable is released, as it was earlier this month (Debian 4.0 aka ‘etch’), the sudden influx of packages can be avoided if you need to. For example, my local server here is on Debian 3.1 (sarge) and I didn’t want to do a major upgrade on it just yet, but needed to install some new packages. At first, by default, I got all the ‘etch’ updates showing up (several hundred), but a quick change of /etc/apt/sources.list to specify ‘sarge’ rather than ‘stable’ (which now points at a different version) was all it took to get me back to simple progressive updates. When I feel like it I can switch over to etch, but right now I have other things to do. RHEL is another system designed for servers which favours underlying stability and security updates over major changes. Their official up2date manager is nowhere near as good as Debian’s apt-get though (although you can plug in apt-get if you want).
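For the curious, the two locking techniques above look roughly like this - a sketch, assuming a standard Debian mirror layout (your mirror hostnames may differ):

```shell
# /etc/apt/sources.list - name the release explicitly ('sarge')
# instead of the moving 'stable' alias, so a new stable release
# doesn't suddenly pull in hundreds of upgraded packages:
#
#   deb http://ftp.debian.org/debian/ sarge main
#   deb http://security.debian.org/ sarge/updates main

# Then refresh the index against the pinned release
apt-get update

# To lock an individual package at its current version,
# mark it 'hold' so upgrades skip it ('some-package' is a placeholder)
echo "some-package hold" | dpkg --set-selections

# Check the selection state
dpkg --get-selections some-package
```

The release-name trick is the coarse control (whole-system), the hold is the fine one (per-package); between them you decide exactly what moves and when.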
In conclusion, I can see why people at our meeting got the impression that package management could be a nightmare on Linux, but I think it’s just down to the fact that desktop distros aren’t built for stability, and update major versions far more often than you’d want on a critical production system. Since there is no ‘one size fits all’ solution in the Linux space like there is with Windows, it’s up to the user to pick the distribution / major version that reflects their position on the stability / new packages curve, which will of course vary depending on the purpose of the machine. I’ve personally run Debian for several years on systems that are critical to me (the OGRE server and my own local file / mail / web server) in ‘stable’ mode and have never had any problems.