Back in June, I quoted Michael Dell as saying “If the Linux desktops could converge at their cores, such a common platform would make it easier to support…”. Basically, as an ISV, what option is given to you if your company chooses to support Linux? Do you make an RPM, a DEB, an ebuild, or one of countless other packaging formats? It makes very little business sense to maintain more than one package, let alone packages for multiple operating systems, and no sense at all to release multiple packages for the same operating system. Something has to give, because ISVs clearly are not producing software for Linux. Windows’ strongest point has always been its platform consistency and backwards compatibility. Linux has in every way imaginable lost this category.
In early December, the Linux Standard Base held a summit in Berlin with representatives from Red Hat, Mandriva, Novell, Debian and Ubuntu to discuss this situation. The end result was a draft API that would enable ISVs to write a single package that would be installable by all platforms conforming to the API specification. One package for everyone using Linux! This is heavy, but it seems to have gotten surprisingly little attention. Since the API is still being drafted, widespread support is not yet in place in the communities, and the earliest fruits of this labor would arrive sometime in 2008. The good news is that, with representation from the top distributions, the adoption needed seems to already be in place. With software vendors forced to alter existing products to be compatible with Windows Vista, more thought than usual might go towards viable desktop alternatives.
One of the concerns already vocalized in the Linux community is that this is a subject that doesn’t need to be addressed. There are already different packaging standards in place (per distribution) that would allow an ISV to distribute software if that company wanted to (say, if they were masochists). Others believe ISVs are not wanted or needed, as they represent proprietary code in an otherwise open system. To these people I would only say that without ISVs, Linux will never achieve mass market share, and might even have died a long time ago. Enterprise software is in demand, and no matter how good GIMP gets, it will never be Photoshop, as the analogy goes. Open source software cannot seem to gather the resources needed to compete with profit-driven software – at least on the desktop.
It seems the long silence posed by Michael Dell’s concern is finally being addressed. As Linus vocalized during the recent debate over removing non-GPL’ed modules from the kernel, it isn’t anyone’s place to dictate how people should use Linux. If it suits your needs, you are welcome to use it. Some want to see it go enterprise and gain traction on the desktop; others want a completely free system. Both camps want progress, and this is a step in the right direction.
Ben wrote – “Windows’ strongest point has always been its platform consistency and backwards compatibility. Linux has in every way imaginable lost this category.”
This is an outright lie. Linux can run applications that were created in 1991 (the year it was created) as well as many other applications made much longer ago than that. As for backwards compatibility, I would have to say that at best it is a good race, but in reality MS took their horse out back and shot it a long time ago. When Win2k came out, it broke a lot of older applications. Now Vista is about to do the same thing again. The platform itself has been pretty rock solid as well. I can take an old 486DX and put Debian on it today. I can also pull out my old Red Hat 5 disks and run Linux without a problem.
Ben also wrote – “Open source software cannot seem to gather the resources needed to compete with profit-driven software – at least on the desktop.”
I will chalk this up to a mistype. I am sure you meant “at least on the [enterprise level].” Linux on the desktop is more than ready for prime time. If not for Apple, my mom would be running Ubuntu at home. She is in no way computer savvy, but even she will tell you it is just as capable but much easier to use.
I will forgive your ignorance of the GIMP and just say that 90% of what I do at work with images could be done with the GIMP, and 100% of what I do at home could be as well. At one time it was 100% of both, and that is what I used. Then I started doing print-ready and pre-press setups, and that is the major area where the GIMP falls short.
Ben wrote – “Windows’ strongest point has always been its platform consistency…”
If I take a deb package from Ubuntu 6.10, it has a reasonable chance of failing to operate correctly under Ubuntu 6.06 – and that is a relatively small change. What if I took that deb and installed it on Debian Etch? What if I took a Fedora RPM and installed it on openSUSE 10.2? The packaging formats are very dependent on an exact platform to install to. Maybe if I compiled from a source package I would have better results, but why doesn’t the packaging take all of this into consideration? If I have a 32-bit exe file, I can install it with much confidence on a Windows 98 machine or a Windows XP machine. The API that this code interacts with seems to be consistent (1) from machine to machine and (2) from version to version. I don’t see these results with Linux, and I am hoping that this is in part what this draft will address.
LOL. Nice backpedal and partial quoting. If I took the binary of vi from Ubuntu whatever and put it on a Red Hat 5 box, yes, it will run. Also, if I take the installer file from nVidia and run it on any of them, yes, it will work once again. If I take my Savage CDs and go to install it, I can do it on Red Hat, Debian, SUSE, Ubuntu, whatever. Same with Wolfenstein, Quake, and several other apps, including OpenOffice and VMware.
You seem to have a fundamental misunderstanding of package management outside of Windows. First, your question as to why you can’t take a package from an earlier release of a distribution and install it in a newer release. Actually, you can, provided you have all of the package’s dependencies installed. If the dependencies are newer than those required by the package, you most likely won’t have any problems. If there was a significant functional change in the dependencies, your older package may not work. If you decide to go in reverse (take a newer package to an older release), you’re going to have problems. It is extremely likely that the packages the one you are trying to install depends on will be too old. If you force the install, then any problems are your own fault.
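The forward/backward asymmetry described above can be sketched as a toy version check. This is only an illustration with made-up data, not dpkg’s or rpm’s real resolution algorithm:

```python
# Toy sketch of the dependency check described above (hypothetical data,
# not a real package manager's algorithm): a package installs cleanly
# only if every dependency is present at or above the required version.

def satisfied(required, installed):
    """required/installed map dependency name -> version tuple.
    True if every dependency exists at or above the required version."""
    return all(
        name in installed and installed[name] >= version
        for name, version in required.items()
    )

# An older package on a newer release: deps are newer than required -> OK.
old_pkg_deps = {"libc6": (2, 3)}
newer_system = {"libc6": (2, 5)}
print(satisfied(old_pkg_deps, newer_system))   # True

# A newer package on an older release: deps are too old -> fails.
new_pkg_deps = {"libc6": (2, 5)}
older_system = {"libc6": (2, 3)}
print(satisfied(new_pkg_deps, older_system))   # False
```

As the comment notes, even a “satisfied” check can mislead if a dependency changed behavior between versions; version comparison alone can’t catch that.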
Second, why can’t you take a deb from Ubuntu and install it in Debian? Simple. Ubuntu is not Debian. Never confuse the two. Why can’t you install a Fedora RPM on openSUSE? Same reason. Each distribution organizes the file system in its own way. The LSB offers some guidelines, but it doesn’t say “you have to install GNOME to /usr/”. Debian installs GNOME to /usr. openSUSE installs GNOME to /opt/gnome. A package is basically an archive with some extra data (dependency data for the most part). A package manager merely unpacks the archive to a root location and updates a package database.
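The “archive plus metadata” point can be made concrete with a toy model (entirely hypothetical structures, not the real deb or rpm formats): installing is nothing more than unpacking files to a root and recording the package in a database, which is exactly why the same logical package lands in different places on different distributions:

```python
# Toy model of the point above: a "package" is an archive (here a dict
# of path -> contents) plus metadata, and "installing" just unpacks it
# to a root and updates a package database. Hypothetical structures,
# not the real deb/rpm formats.

def install(package, root, db):
    """Unpack the package's files under its prefix and record it in db."""
    for path, contents in package["files"].items():
        root[package["prefix"] + path] = contents     # unpack to root
    db[package["name"]] = list(package["files"])      # update package db

# The same logical package, laid out differently per distribution:
gnome_on_debian = {"name": "gnome", "prefix": "/usr",
                   "files": {"/bin/gnome-session": "..."}}
gnome_on_suse = {"name": "gnome", "prefix": "/opt/gnome",
                 "files": {"/bin/gnome-session": "..."}}

fs, db = {}, {}
install(gnome_on_debian, fs, db)
print("/usr/bin/gnome-session" in fs)  # True: Debian puts GNOME under /usr
```

A package built with one prefix baked in simply doesn’t match a filesystem organized around the other, which is the cross-distribution problem in miniature.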
Sure, you could extract some distribution’s package on another. You could then play the game of moving the files to the right locations for your distribution. Then you get to play the dependency game. And if the package depends on an explicit version of libc, and your version of libc isn’t quite the same, good luck dealing with that nightmare.
In short, packages for specific distributions are a _good thing_. If they are built and distributed by the distribution builder(s), then you are just about guaranteed they will work correctly. If some sort of ISV wants to support a specific distribution, goodie for them. If they would actually get on the bandwagon, it wouldn’t matter what distribution they officially support. Distributions could create their own packages.
Also, just because you have an executable that works with Windows 98 does _not_ mean it will work with Windows XP. There are significant differences in those two platforms. There is a reason why Microsoft included the “Windows 98 compatibility mode” option in the shortcut properties. There are plenty of applications that won’t even work with that.
There are installers for Linux that work like installers for Windows. I can think of at least three off the top of my head. Why aren’t they used? They are a poor way to install software. If you sit down and write out a list of pros and cons for Linux software management versus Windows software management, especially in the server environment, I think you will find the list of “pros” is much longer on the Linux side.
Oh, and an ebuild isn’t a package. It is a waste of time.
The situations that concern me are the ones where the package manager reports all dependencies as satisfied, without the user forcing any options. If I take a package built for Ubuntu 5.10 and attempt to install it on Ubuntu 6.10 (going forward), I have no confidence that the utility will run correctly, even when the installation is reported successful. Perhaps dependencies that shipped with that distribution were excluded from the deb control and were absent in later releases, thus causing failure. Perhaps it’s just poor community packaging.
I understand why a cross-distribution scenario is likely to fail, but the LSB is proposing that this SHOULD be the responsibility of package management. This API would abstract the filesystem layer (and anything else) in order to resolve these issues. How is a user to know which .deb is for which distribution? – The packaging looks identical. Wouldn’t it be something to have a single package that installs on ANY conforming distribution?
It seems that if the biggest distributions are embracing this change it means they feel something is left to be desired with the current implementation. I agree that automatic dependency resolution, and other features make package management on Linux light years ahead of Windows, but that doesn’t mean there isn’t room for improvement.
Great discussion btw, I miss this…
I see the same problem that James sees. I think that from a usage standpoint, the current way of installing packages (distribution specific) is fine. The only way I see that they can do this without restructuring every distribution is if they had an installer with options for which distro you wanted to install to. However, I again see a problem with this, because new distros come out and they would not be able to update every package every time. If they can pull the API off, it will be interesting to see what happens.
If you install an Ubuntu 5.10 package cleanly in Ubuntu 6.10, it is highly likely that the application will work correctly. The only thing that would cause the program to not run is if it were built against a different ABI (see http://en.wikipedia.org/wiki/Application_Binary_Interface). When a package installs cleanly, that is, without warnings or errors, then all required dependencies have been met; in other words, the dependencies are present on the system at the time of manual installation of said older package.
The LSB does _not_ guarantee that a package from one distribution will install cleanly in another distribution without modification. See http://en.wikipedia.org/wiki/Linux_Standard_Base . If you want to install a Red Hat RPM on Debian, you can do that. You just have to convert the package with alien first. The LSB’s goal is to assure certain fundamental OS aspects are uniform across distributions. This means the core file system (that doesn’t include where optional programs are to be installed), core utilities (programs), and API. In other words, ifconfig should be under /sbin and perl should be under /usr/bin.
From http://refspecs.freestandards.org/LSB_3.1.0/LSB-Core-generic/LSB-Core-generic/swinstall.html :
“Applications shall either be packaged in the RPM packaging format as defined in this specification, or supply an installer which is LSB conforming (for example, calls LSB commands and utilities).
Note: Supplying an RPM format package is encouraged because it makes systems easier to manage. This specification does not require the implementation to use RPM as the package manager; it only specifies the format of the package file.”
That doesn’t mean the contents of the RPM archive should be uniform. It means that a distribution should either use the RPM format (which describes dependencies, et al.) for its packages or provide a way to install such packages (alien).
Paraphrased from Ian Murdock’s blog:
“…every Linux specific thing [ISVs] have to support costs money…
How do ISVs handle this today? For the most part, they ignore the package systems on Linux and do their own thing. Trouble is, while doing their own thing gives ISVs the flexibility to work cross platform, it ultimately makes their products integrate poorly in the broader systems management context because the package systems know nothing about them…
..ISVs want to treat Linux as a single platform, which means they want to offer a single package for Linux, much as they do for Windows.
…an installer just needs to be able to query the system to see if it’s LSB compliant, and if it is, what version of the LSB it’s compliant with; and it needs to be able to “register” with the package system, so the package system knows about it, including what files it has installed.”
What is being proposed is a package-system abstraction. This new 4.0 spec addresses the limitations in your quotes from spec 3.1. It seems the spec doesn’t go as far as I thought it might (filesystem abstraction), but it should definitely make things easier. What’s more, now that the ISV software is talking to the package management systems, dependency resolution could be simple, even with a tarball.
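A rough sketch of the installer flow Murdock describes, query the system’s LSB version, then install and “register” the files with the package database, might look like the toy below. Every name here is hypothetical, since the 4.0 API was still a draft at the time:

```python
# Rough sketch of the proposed flow: query the system's LSB version,
# and if it is new enough, unpack the payload and then "register" the
# files with the package database so the package manager knows about
# them. All names and structures here are hypothetical; the real LSB
# 4.0 API was still a draft when this was written.

def lsb_install(installer, system, min_lsb=(4, 0)):
    if system["lsb_version"] < min_lsb:
        return False                          # not compliant enough; bail out
    for path in installer["files"]:
        system["filesystem"][path] = "..."    # unpack the payload
    # "register" with the package system, including the installed files
    system["package_db"][installer["name"]] = installer["files"]
    return True

system = {"lsb_version": (4, 0), "filesystem": {}, "package_db": {}}
app = {"name": "someapp", "files": ["/opt/someapp/bin/someapp"]}
print(lsb_install(app, system))  # True: installed and registered
```

The key difference from a vendor’s one-off installer is the registration step: once the package database knows about the files, the system’s normal management tools can see, verify, and remove them.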
From your link:
“So, if your “solution” is to tell ISVs (independent software vendors) to give us their source code so the distributions can include it because that’s just how we do things, you can safely skip the rest of the post below. You’re simply not going to agree that any of this is a problem.”
I fall into that camp. Albeit, with some exception. I don’t particularly care if a company like nVidia distributes their own installer. It just depends on the quality of installer they produce. nVidia’s, like their drivers, is quite good. ATi’s, again, like their drivers, is quite a pile of shit. If a company is going to do a good job of producing an installer, I’m fine with it and don’t give a damn if it is recognized by my package manager. If the installer is going to be complete shit, then make the source code available, or even make the binaries available under a license that allows redistribution (I believe, and don’t quote me on this, both nVidia and ATi do this), so that distributions can include it in their package repositories.
I haven’t read the rest of that post yet (Ian, the founder of my favorite distro, says I don’t have to 🙂 ), but I can go ahead and make one statement that I believe will hold true. Just because a distro is LSB compliant, and Debian is one of those, doesn’t mean a package created by an ISV for an LSB compliant system is going to work. The simple fact that said ISV can’t be bothered to take the time to learn what it is they are creating a package for implies there is a great chance of the package not working correctly or even completely hosing a system (yay! for run-ons).
I use Debian on my servers. I don’t want some third party telling them how to create their packages. They invented the package management system that almost everyone emulates in some way. It’s been working for at least a decade. There is no need to “fix it”. Arch Linux is what I use on my desktop. Their package system is modeled after the Debian system. I also don’t want a third party telling them how to create packages. Pretty much for the same reason. But then, Arch isn’t LSB compliant *grin*.