On Linux Platforms

One of the major differences between Linux distributions and other Operating Systems (both Free and non-Free) is that Linux often tries to give you everything from one source. Want a piece of third-party software? You’re expected to get it (and its dependencies) into the distribution, and install that version. Other Operating Systems provide a base platform upon which third-party tools, libraries, and applications can be installed into a separate location. This is close to the original intention of /opt, but there it’s actually used rather than shunned as if it were some kind of bad idea to want to do this, and it allows one version of the basic OS to live for a number of years independently of any or all of the applications installed.

Unlike many distro folks and Linux enthusiasts, I actually prefer the idea of providing a basic, stable, unchanging platform upon which self-contained applications can be installed. Kinda like “Enterprise” Linux, but different – Enterprise Linux distributions basically snapshot a particular set of distro software and treat that like a “platform”, while their upstream sources don’t. In my perfect utopia, there’s a huge, bright line between basic OS components and everything else. I want a stable OS, but I might want to install a more recent web browser, or some engineering design tool that is newer than what my OS ships, and I want to be able to do that trivially and independently of the OS. I don’t want it installed in /usr/bin. I want my OS-supplied core junk to go in there, but I want my applications to live separately. A few distros have even tried this stroke of sanity by cloning the OS X /Applications style of behavior, but only experimentally.

In my perfect world, I would get “Fedora” from the Fedora Project, I would install it, and I would get a basic environment including a desktop. It might even include a web browser, but it would not include all of the other stuff. Instead, that would be installed into completely separate directory structures, fully self-contained and away from the basic OS environment. Some of it might come with the distro, and some of it might even be packaged and distributed using distro tools, but it would be trivial to upgrade any of it independently of the base OS platform because it would still be stored separately from core system components. Try installing a different version of Firefox, or some other system-supplied app, on your favorite Linux distribution without having to place it into a separate directory yourself, forgo actual packaging, or butcher the distro config.
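
To make that concrete, here is a minimal sketch of the kind of self-contained install I mean, written as a small Python script. The tarball name, the version, the /opt paths, and the assumption that the tarball unpacks into a firefox/ subdirectory are all made up for illustration, and a real installer would do far more (desktop entries, updates, verification); the point is simply that the application and its bundled libraries land in one directory of their own and never touch /usr/bin:

    #!/usr/bin/env python3
    # Sketch only: unpack a (hypothetical) upstream application tarball into its
    # own directory under /opt and give it a small launcher script, so the app
    # and its bundled libraries live entirely outside the OS-managed /usr/bin.
    import os
    import tarfile

    APP_TARBALL = "firefox-99.0.tar.bz2"   # hypothetical upstream tarball
    APP_ROOT = "/opt/firefox-99.0"         # self-contained install root
    LAUNCHER = "/opt/bin/firefox-99.0"     # wrapper script, kept out of /usr/bin

    os.makedirs(APP_ROOT, exist_ok=True)
    with tarfile.open(APP_TARBALL) as tar:
        tar.extractall(APP_ROOT)           # everything stays under APP_ROOT

    os.makedirs(os.path.dirname(LAUNCHER), exist_ok=True)
    with open(LAUNCHER, "w") as f:
        # The wrapper points the binary at its bundled libraries, independent of
        # whatever versions the base OS happens to ship. (Hypothetical layout:
        # the tarball is assumed to unpack to a firefox/ subdirectory.)
        f.write("#!/bin/sh\n")
        f.write('export LD_LIBRARY_PATH="%s/firefox"\n' % APP_ROOT)
        f.write('exec "%s/firefox/firefox" "$@"\n' % APP_ROOT)
    os.chmod(LAUNCHER, 0o755)

Removing the application is then just a matter of deleting APP_ROOT and the launcher; nothing the distro owns ever gets touched.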

One day, what I want is going to happen. There will be a realization in the wider Linux community that consumers want a basic platform and that they want to be able to treat other pieces of non-core junk independently of that. But this realization (in the Linux space) is still several years away, and it will only come after more people realize the benefit of having a computer that just works, without needing to hack on, update, or otherwise mess around with OS pieces to get there.

Jon.

13 Responses to “On Linux Platforms”

  1. But I have third-party apps that live in their own directories (either because they came prebuilt or because I didn’t want to install them “properly” under /usr/local). Those go under /opt on my machine. As of this writing, the Gargoyle IF interpreter and the Sun Wireless Toolkit reside there, and so did Firefox 4 while I tried it out.

    Now, having the main OS repos separated from the 3rd-party app repos, as in *BSD, would make more sense indeed, and would allow much longer lifecycles for a distro. But what exactly *is* the main OS wrt Linux? Even the basic userland tools — shell and utilities — come from a different project. It’s just not the same model.

  2. Dave says:

    yeah I recently realised we’re still only one step above Slackware, i.e. where we were 15 years ago wrt distros. We’ve got a package manager and for some reason have stalled at the whole “providing a platform” step.

    Also see Jon McCann’s GNOME OS slides
    http://blogs.gnome.org/mccann/tag/gnome-os/

    which is sorta the same idea about defining a platform and separating application space.

  3. Zygmunt Krynicki says:

    It’s very tempting, but there is one quite nontrivial problem to solve. In other OSes that are a platform, somebody gets to say what the platform is (which set of libraries, at which versions). In our world, ownership of “key pieces” is spread around several dozen entities. Even if Fedora or Ubuntu were to freeze them at a particular version, it would still not be a single strong platform.

    I think that it might work, but we’d have to draw the platform line in a way that the base OS comes with a large set of private frameworks (essentially the things we package to build a compelling user experience but don’t want to, or cannot, offer as stable APIs/ABIs) that other software should not link against.

    Well there is also a mindset problem. A lot of people do not see any issues with the current model. Both models have downsides but IMHO you are right to point out that non-developers want a different model for obvious reasons.

  4. DDD says:

    It would be a really good start to improve the “third-party” repository UI in yum. Imagine that you were trying to sell Office or Photoshop. You would still want to provide installation and updates in the usual way, but you wouldn’t want to mess with the Fedora repos.

    …and people complain about Apple!

  5. Julian Aloofi says:

    Completely agree with you, I’d love to see that happen. Both openSUSE and Ubuntu are taking steps in that direction with the BuildService/PPAs, which still use the “package” mechanism etc., but at least allow up-to-date packages from the original authors.
    It would eliminate a massive amount of duplicated packaging effort, take release responsibility and work off the distributions, and allow some neat things like multiple versions of an application installed in parallel.

  6. jcm says:

    I agree with you (obviously), except I don’t think the “GNOME OS” is the answer. The GNOME project is doing its best these days to convince previously die-hard fans like myself to run screaming to another desktop, so I’m not listening to their “solution” to this problem.

  7. jcm says:

    Oh, and the nice thing about blogging is that I know I could make proposals along the lines of this post until the cows come home and certain folks just wouldn’t get it (so not worth trying), but I can at least espouse my opinion here without lulling myself into thinking anything will actually change.

  8. Kevin Kofler says:

    I don’t think this is a good idea at all. I find it to be actually much EASIER to install software under GNU/Linux than under proprietary alternatives: just fire up KPackageKit or gnome-packagekit, enter the app name or select it from categories (the latest KPackageKit now fully supports comps categories, including subcategories), and a couple of clicks later, voilà, there’s your package. It can’t really get any easier: no need to track down the upstream web page, its download area, and the correct download for your OS and CPU architecture; much reduced risk of viruses or other malware hidden in your download; etc. And the software is automatically updated along with the rest of your distribution; you don’t have to update each program on its own, nor do you have multiple auto-updaters fighting for your network and CPU bandwidth at every system startup.

    And if people really want to provide packages elsewhere, they can easily do this with a third-party repository, e.g. you can get current versions of Firefox from blog.famillecollet.com. I also have a repository up at repo.calcforge.org. Just install the *-release package for your distribution and you can install and update any software in the repository just like the distribution-provided packages. And the system guarantees that the packages are built for YOUR distribution, not some random one which may or may not be compatible.

    The nice thing about Free Software is that it can be easily packaged for your distribution and put into a central repository. A common cross-distro base platform would only help proprietary software, which is IMHO very counterproductive. For Free Software, the limitations imposed by stable ABIs just mean projects cannot rely on shared components for many things (especially new APIs) and have to either bundle their own copies of everything (yuck!) or give up on the whole “stable ABI” model (which is what happens now: there’s the LSB, but almost no Free Software project targets it).

  9. Kevin Kofler says:

    PS: The real solution for implementing your wish of having current applications on a stable platform is to have less restrictive update policies for stable distribution releases, i.e. the new Fedora “stable update vision” needs to be overturned. (I think what users really want is not a completely unchanging base; they just want disruptive changes withheld, like the libata migration in the kernel which changed HDD device names 3-4 years ago, or the big move from KDE 3 to 4, but they DO want new improved kernels, 3D acceleration support for their new graphics card, etc.)

  10. jcm says:

    Kevin, you and I couldn’t disagree more :) Having a “free for all” approach to updates works great if software doesn’t have bugs, works perfectly, and everyone is willing to upgrade daily, weekly, or whatever. Meanwhile, in the real world, there are people like myself who want a computer that just works, against which software can be shipped and updated without it having to be in-distro and without me having to run rawhide for my system to be “usable”.

    But I don’t foresee us agreeing. Your approach to updates and mine are so fundamentally opposed. I think you are wrong, and you think I am wrong.

  11. yeah, Kevin’s right. The repository approach is one of the main strengths of the Linux distro model. I absolutely do not want to have to rely on a hundred different third party repos for my software; I want a single project’s throat to choke.

  12. jcm says:

    Yes, and this is why Linux will never be as “mainstream” as other consumer Operating Systems. The answer is to empower third parties to ship software and tools, not to require them to submit to the whims and processes of the Linux distribution. As I said, “Enterprise” Linux distros kinda fix this problem by working around the flaws of the distro-centric model and providing a stable target that isn’t moving all the time, but it’s not a perfect solution (yet).

  13. Andrew Clayton says:

    Hmm, so currently if you have 10 packages installed that use libfoo, then you only have one copy of libfoo that they all use. Only one copy to worry about updating for bug/security fixes. Only one copy taking up RAM.

    In your world, are you saying that each app would bring along its own copy of libfoo?
