“Linux distribution”

At the Plumbers Conference right now – just finished watching Arjan and Auke’s 5-second boot time talk. Very, very impressive talk.

They drove a lot of changes into the entire stack – from how the kernel works, to fixing HAL and X.org bugs, to filesystem tweaks – in order to get to 5 seconds. Of course they cut some things out too: since the netbook targets are mobile, single-user machines, they don’t use GDM, and I’m pretty sure they aren’t starting CUPS, etc.

But this talk only made me feel more strongly about something that has been growing on me lately – that we have been far too complacent about accepting “Linux distribution” as a concept. This is the idea that you can take the Linux kernel and a variety of tarballs found on the Internet, compile them, package them up in some variant of tar, and you have an operating system. It turns out it’s not that simple.

To create a compelling system, you first need to make decisions about how things work and how they fit together. The “Linux distribution” mindset is an excuse for different vendors to make different decisions, and we end up sharing very little. We need people like Arjan’s team, the Ubuntu team, the Fedora team, and the OpenSuSE team to share far more than we do now. A lot of vendors (“distributions”) tend to have an insular mindset where they think they are the center of the universe – why doesn’t everyone just join their project anyway?

Five or six years ago I think there was more of a mindset that Linux and the Free Desktop were going to take over from Windows (“year of the Linux desktop” and all that), and each vendor thought it was going to take over the world and consequently was less interested in collaboration. Personally, I don’t think we’re going to see that sudden takeover any time soon, but if we do things right there is a lot of opportunity.

It’s easy to say distributions are bad, of course – far harder to actually collaborate better. How do we do that? Here’s an idea:

Share a core OS image built at freedesktop.org

We’re not going to unify all the different packaging tools tomorrow. However, there’s no reason the core OS build tools and image (kernel, dbus, HAL, X.org) couldn’t be shared. “Distributions” could keep their favorite variant of tar and wget (.deb and apt, rpm and zypper, rpm and yum) for everything outside of the image. In other words, you could still do “apt-get install epiphany” or “yum install httpd” or install whatever other app on top of the image, and that would still work in the normal “packaging” way, but the core OS image itself would be rsync’d outside of that.
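To make that concrete, here’s a rough sketch of what the update flow could look like. This is purely illustrative – the rsync module on freedesktop.org and the /core-os path are invented for the example:

    # Hypothetical: update the shared core OS image (kernel, dbus, HAL,
    # X.org) straight from a freedesktop.org build via rsync – one shared
    # artifact, delta transfer, no per-vendor packaging involved.
    rsync -a --delete rsync://freedesktop.org/core-os/current/ /core-os/

    # Everything outside the image still comes from the vendor’s own
    # package manager, exactly as today:
    apt-get install epiphany    # on a .deb vendor
    yum install httpd           # on an rpm/yum vendor

The key property is that the image is one artifact every vendor updates the same way, while everything above it stays in the vendor’s native packaging world.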

Each vendor would still control all of the code – each component of the image could have a “vendor” branch where vendors can add kernel drivers and the like. Image update policy would still be controlled by vendors.
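As a sketch of how that could work, assuming the image components lived in shared git repositories at freedesktop.org (the repository path and branch name below are invented for illustration):

    # Hypothetical: a vendor keeps its additions – extra kernel drivers,
    # local patches – on a branch of the shared component repository.
    git clone git://anongit.freedesktop.org/core-os/kernel
    cd kernel
    git checkout -b vendor/fedora origin/master
    # ...commit vendor-specific drivers here...
    git merge origin/master    # pull in shared changes when the vendor chooses

Vendors decide when to merge and when to ship, which is exactly where the update policy control stays.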

There are of course other things that it’s completely stupid to fork, like bug trackers and, more importantly, the bug reports. We need to do a much better job of sharing bugs in the right place – the upstream project.

In a sentence: treat projects like freedesktop.org and gnome.org as more than just tarball sources, and turn them into real collaboration points where things get tied together and actual infrastructure is shared. It wouldn’t be easy, but really, we need to stop thinking it’s acceptable to have 5+ major forks of the core OS infrastructure.

