Ubuntu - The Art of Release
Mark Shuttleworth blogs in The Art of Release about Ubuntu's next release dates and a possible synchronization with other Linux distros.
According to some comments on the blog, it seems Hardy Heron was not well tested: to get it out on its planned release date, QA was sacrificed.
I think Mr. Shuttleworth now recognizes why all major distributors have several developers in all major software projects. It is a give and take, and if Ubuntu only takes, without the expertise or manpower to give back, it also lacks the manpower to ship a "beta" version of one of the major projects. (How many kernel patches come from the Ubuntu kernel team?) I think he suggested a synchronized release date to overcome this shortage.
I am proud of how our Fedora Project community managed to get out a good Fedora 9 with nearly the same software.
Here are some noteworthy comments on his blog entry:
Why would Red Hat cooperate with Ubuntu, especially now that Ubuntu also has its sights set on the server market? Don’t they consider Ubuntu a threat?
I don’t think Red Hat would see Ubuntu as a threat; we appeal to different audiences.
Have you ever noticed that competing car dealerships, fast food restaurants and other very similar businesses all set up shop next to one another? You get food courts in shopping malls, for example, where you have all the take-away food places in one area. You would think that the competition would be bad for them. But counter-intuitively, all the restaurants do better when they are all in the same place, and the same is true of car dealerships. The phenomenon is called “clustering”, and it works because people first decide to go “looking for a car” and then later decide which car, or which dealership. I think Linux is the same - if we coordinate our releases, we send a very strong message to the outside world that will bring more people to Linux in the first place - making the pot bigger for everyone.
With the release cycle being this fast, Ubuntu as well as X.org/Gnome/KDE and other projects are in a constant beta cycle. Quality and quality assurance are always secondary to the developers’ desire to create new stuff. Of course one can always stick with an old version of any given Linux distro and not upgrade, but then one might not get a) new applications or b) new versions of old applications with important fixes. This is because things are always changing underneath applications, and application writers naturally adapt their code to that. Thus APIs and ABIs are constantly changing, and in order to get new stuff the user has to commit to a constant upgrade/reinstall cycle. Personally I absolutely hate upgrading. I want to spend my time doing my work, not wasting it on resolving the problems (there always are some) after an upgrade or reinstall and retweaking my settings to my liking. Now contrast this with Windows, where the API has stayed the same for years and developers just make do with that. Yet new applications emerge every day. Imho, this instability and constant change is the greatest problem with Linux as a whole.
The bottom line is, you won’t get a rock-solid system release until you freeze features and go through several “bug-fix”-only cycles. And this is the greatest challenge the Linux/Ubuntu/open-source community has to face if it ever seriously plans to become anything more than a side note.
You make a very good point about the pace of releases and one’s ability to make fixes. We can of course choose how many new components we bring into a release that is going to be designated LTS, and this is the primary tool we have for allocating time between new-version-integration and stability/fix work. My sense, though, is that the upstreams would respond very positively to a coordinated cue from distributions w.r.t. their long-term support plans. If upstream knows that multiple distributions will be using a particular version, they can respond appropriately.
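To make the commenter’s API/ABI point concrete, here is a minimal C sketch of how an ABI breaks even though the API looks unchanged. The library and field names are invented purely for illustration; the pattern itself (a public struct growing a field between library releases) is a classic source of exactly the breakage being described:

    /* libfoo 1.0 public header -- what an application was compiled against.
     * All names here are hypothetical. */
    #include <stdio.h>
    #include <string.h>

    struct foo {
        int  id;
        /* Suppose libfoo 2.0 inserts a new field here:
         *
         *     int flags;
         *
         * The API (struct tag, function names) looks unchanged, so old
         * source still compiles -- but every field below shifts by four
         * bytes, so a binary built against 1.0 and run against the 2.0
         * shared library reads "name" from the wrong offset. That is an
         * ABI break the compiler cannot warn about. */
        char name[32];
    };

    int main(void)
    {
        struct foo f;
        f.id = 42;
        strcpy(f.name, "tv-tuner");
        /* Field offsets are baked in at compile time; only a rebuild
         * against the new header fixes them -- hence the constant
         * recompile/upgrade cycle the commenter complains about. */
        printf("%d: %s\n", f.id, f.name);
        return 0;
    }

This is also why a distribution that ships new library versions every six months forces rebuilds of everything compiled against them, while a frozen release can take bug fixes without disturbing applications.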
Ubuntu has been nothing short of revolutionary in the linux world, not in terms of the quality of the software or the packaging, but because of its appearance at a time when the linux world was rapidly expanding and its impeccable community infrastructure and wide-ranging support. Its focus on free, community-focused yet well-supported and functional software certainly created a reference point, spurred competition and opened the door for linux to the Small-Medium Enterprise world.
Yet calling Ubuntu 8.04 an ‘enterprise platform’, with its half-baked, semi-functioning gvfs, its largely unintegrated and unsupported pulseaudio and a beta version of Firefox 3.0 as its default browser is in no way an accurate characterisation.
Moreover, it’s somewhat ironic that the punctuality of the release is employed in this manner to showcase the merits of the project. Yes, the Ubuntu developers, the volunteers, the users, the community as a whole is commendable. But the fact that Ubuntu came out ‘on time’ means nothing when ‘on time’ translates to beta quality software at times inferior to the version it succeeded.
It’s a shame 8.04 was tagged as an LTS release. While a great ‘normal’ release, it’s certainly quite far from the robust, polished and functionally conservative featureset that any ‘enterprise’ release is typically associated with. In fact it’s exactly the opposite: flamboyant, bleeding-edge and somewhat buggy. I’m certain that the decision was conscious and that Canonical is expecting 8.04.1 to remedy all that’s wrong with 8.04. Hopefully it will, yet I cannot but feel that 8.04 LTS should’ve been 8.06 LTS, or at least 8.04 sans-LTS. The fact that it isn’t, coupled with this post, introduces a political layer to its development, one that could potentially affect its popularity where the ‘LTS’ label has any meaning.
Mark, I am particularly disappointed at what Ubuntu has become over the last few years. At the very outset, I’d like to say for the record that I’m no GNU/Linux noobie trying to express my frustration here. I’ve been using GNU/Linux since 1998 (the Red Hat era). Experience has taught me never to bother with upgrades and always go for fresh installs. So none of the following problems relate to upgrade issues.
In the beginning there was all this hype and publicity that Ubuntu gained for being the most user-friendly distribution out there. But as time progressed, I began to notice that problems became more commonplace with newer (and quicker) releases. For instance, while Feisty worked well on my laptop and my desktop, Gutsy started showing signs of beta software: there were many unexplained terminal crashes, X lockups, etc. I was quite confident that Hardy would not only rise above these small problems, but prove to be the best distro out there. Unfortunately this is not the case. Not only did Hardy refuse to install drivers for my USB TV tuner (which worked well in Gutsy, by the way), it also magically dropped support for suspend/hibernate on my laptop. The point is that the hardware in question is not *new*. On top of that, it was well supported by earlier Ubuntu releases.
Instead of merely complaining about things that don’t work for me, I decided to poke into the causes of these problems. To this end, I decided to try out different distributions which can be regarded as being on par with Ubuntu with respect to software versions. These were Debian (Lenny) and Fedora 9. Even though Debian had an Ubuntu-skeleton feel to it, it turned out to be more stable and all my hardware worked well with it. Granted, I still had to build a few drivers myself. On the other hand, Fedora 9 proved to be even better as *everything* in my laptop (including my TV tuner) was detected out of the box (something I expected out of Ubuntu Hardy) and even suspend/hibernate works flawlessly (so far). Granted, F9 is bleeding edge, but it’s not like Ubuntu is running an ageing 2.4.x kernel compared to F9.
I am never shy to experiment with building drivers/kernel modules or even recompiling the Linux kernel. I’ve done these dozens of times on several different distributions. But somehow, every time I tried to support my hardware in this fashion, Hardy kept pissing me off with all kinds of compile errors. It would simply not work with the standard ways of rebuilding kernels and modules. After poking around in the Ubuntu forums, I finally realized that part of the problem stems from Ubuntu’s own unique way of managing things, even when it requires violating Linux Standard Base (LSB) guidelines. This, IMO, is not a good sign. At the very least, I expect that, being based off Debian, Ubuntu should know when and where to stop its modifications/improvements. If you diverge too far from Debian (testing), you risk stability, and that is exactly what Hardy is now showing signs of. Part of the problem can be attributed to very short release cycles, but that is no excuse, is it? Look at Fedora.
To sum it up, although Ubuntu is doing a great job weaning people from MS Windows, I fear that its success will be short-lived. I’d hate to see that happen to a distro like Ubuntu which showed great promise in the beginning.
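For readers who have never done it, the “standard way of rebuilding modules” the commenter refers to is remarkably small, which is what makes distro-specific breakage so frustrating. Here is a minimal sketch of an out-of-tree module (the module name is invented for illustration; it assumes the kernel headers package matching the running kernel is installed):

    /* hello_mod.c -- a minimal out-of-tree kernel module.
     *
     * The conventional kbuild Makefile placed next to this file is just:
     *
     *     obj-m += hello_mod.o
     *     all:
     *             make -C /lib/modules/$(shell uname -r)/build M=$(PWD) modules
     *
     * On a distribution that sticks to this convention, running `make`
     * and then `insmod hello_mod.ko` is the whole workflow; the complaint
     * above is that exactly this workflow produced compile errors on Hardy. */
    #include <linux/module.h>
    #include <linux/kernel.h>
    #include <linux/init.h>

    static int __init hello_init(void)
    {
            printk(KERN_INFO "hello_mod: loaded\n");
            return 0;
    }

    static void __exit hello_exit(void)
    {
            printk(KERN_INFO "hello_mod: unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);

    MODULE_LICENSE("GPL");
    MODULE_DESCRIPTION("Minimal out-of-tree module example");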
More comments can be found on Slashdot, like this one:
If at least three of these can agree to ship the same LSB version at approximately the same time, they won't be doing anything new, but they could gain the benefit of sharing bug reports, sharing device drivers (since a standard kernel would be the focal point for driver development), even sharing management tools (since they all assume the same LSB version), and better support from 3rd-party proprietary products like Flash, Oracle, and VMWare (which still hasn't shipped a working version for the Ubuntu LTS kernel). Granted, the last point might cause FSF devotees to throw fits, but unfortunately some of us wouldn't be able to move to Linux without these products (e.g. I use Oracle at work and my wife needs VMWare to access some Windows-specific functionality at her bank that crashes KVM and VirtualBox and does not work at all in WINE).