I’m far from what you might call a die-hard anything when it comes to operating systems. Growing up, I used a mix of DOS and OS/2. OS/2, if you don’t recall it, was the Betamax of OSes (or was that BeOS?). Originally conceived as a collaboration between IBM and Microsoft, it wound up as a bastard stepchild beloved by neither. But it was better, dammit, so my dad bought it and used it.

In the late 90s and early aughts I used Windows 95 and 98, and NT 4. It was crap, but it was ubiquitous crap, and it was what everything supported. I was fairly productive under those OSes; I still have some of the more obscure shortcuts ingrained in my muscle memory, like Win-Break to bring up System Properties. The subject I would later come to know as “package management” was a constant headache, however. Every piece of third-party software had to be updated in its own way, and then there was DLL Hell to worry about. God help you if you had just read an article about the Data Display Debugger and wanted a fully functional *NIX userspace to try it out on.

In the early to mid aughts I experimented more and more with various flavors of Linux. At first, I experienced similar problems managing the software installed on my system. Anything off the beaten path would trigger a search through the hit-or-miss listings of RPMFind, looking for that one missing package. It was a mess.

Finally I came around and started using Debian-based systems, and that’s where things got interesting. Sure, the Linux GUI situation was still shaky at best. But for the first time, for pretty much any new utility I read about on Slashdot, I could type “apt-get install cowsay” (or whatever) and a minute later it would be installed and configured. Not only that, but the next time I did an “apt-get update; apt-get upgrade” all those packages would be upgraded to the latest stable versions at the same time. This was pretty neat. But it got better: to get up and running with MySQL, you did this:

apt-get install mysql-server

Then you got a cup of coffee, and when you came back, the server was already up and running with some sensible default settings. This was downright magical.
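For concreteness, the whole routine boils down to a couple of commands. The sketch below is a dry run: each command is echoed rather than executed, so nothing actually installs (on a real Debian/Ubuntu box you would define `run` to invoke sudo instead; mysql-server is just the example from above).

```shell
# Dry-run sketch of the apt routine described above. Commands are printed,
# not executed; on a real Debian/Ubuntu system, use: run() { sudo "$@"; }
run() { echo "+ $*"; }

run apt-get update                 # refresh the package index
run apt-get install mysql-server   # fetch, install, and auto-configure
run apt-get upgrade                # bring every installed package current
```

The last line is the point: one command updates everything installed through the package manager, instead of one bespoke updater per application.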

All this time I’d also been keeping an eye on the developments coming out of Apple after Steve Jobs’ return. I got more and more excited about what I saw of OS X. A beautiful, functional UI over a solid BSD core? Count me in! I came to the mental resolution that once finances permitted, I’d gradually switch over to a Mac-centric household.

In 2007, I managed to convince the company I was working for to give me a MacBook to work on. I loved that thing. Gorgeous, powerful, portable, and a pleasure to use. It was everything I’d looked forward to.

Well, mostly. There was still that niggling issue of package management. When I arrived, Mac developers were already on the second wave of UNIX-side package management solutions. Fink, a re-purposed apt-get system, was on the way out, and MacPorts, a more BSDish solution, was the new hotness.

MacPorts worked well enough if you stayed on the golden path of a couple dozen well-maintained packages, and you didn’t need any unusual compile options. Unfortunately, I’m the kind of polyglot hacker who will randomly decide that today is the day to play with XOTcl. MacPorts had a somewhat limited selection compared to APT; but even within those boundaries all was not well. I had a lot of MacPorts installed. And like a good admin, I made an effort to keep the packages updated. Unfortunately, more often than not, this meant that after leaving the system to compile updated packages for 24 hours, I’d come back to find my scrollback buffer had overflowed with compilation errors stemming from badly- or un-maintained packages. Eventually I just gave up on updating.

Meanwhile Ubuntu was maturing. By the end of my ~2 year stint with a Mac, I was doing most of my work inside of Ubuntu VMs running under Parallels.

It wasn’t just that the Ubuntu package archive contained everything I ever wanted, and the packages invariably Just Worked when I installed them, and system updates were never a problem. It went beyond that. All Debian-based systems have this thing called Policy. It’s kind of like Leviticus for package maintainers. It specifies finicky little things like naming conventions for packages. And where the code examples must be installed if your package comes with examples. And how your startup script must behave if your package contains a daemon. And where config files go, and where the example config files go. And how all lockfiles, pipes, and pidfiles must be placed in the appropriate namespaced paths under /var. Etc., etc., etc.
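To make that concrete, here is a sketch of where the important files land for a hypothetical daemon package named “foo” (a made-up name, not a real package), following the conventions Policy and the Filesystem Hierarchy Standard lay out:

```shell
# Sketch: predictable, namespaced file locations for a hypothetical daemon
# package "foo", per the conventions Debian Policy and the FHS describe.
policy_paths() {
  pkg="$1"
  echo "/etc/${pkg}/"                      # configuration files
  echo "/etc/init.d/${pkg}"                # startup script for the daemon
  echo "/var/run/${pkg}.pid"               # pidfile, namespaced under /var
  echo "/var/lock/${pkg}"                  # lockfile, likewise
  echo "/usr/share/doc/${pkg}/examples/"   # shipped code examples
}

policy_paths foo
```

Because every package follows the same scheme, you can guess these paths for a package you have never even installed.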

When you spend more than a little time dicking around with assorted software packages from assorted sources, this stuff goes from being amusingly anal-retentive to being manna from heaven. You know where to find the things you are looking for, every time, without fail. And you know that installing obscure package foobar is never going to fuck over some existing package the way packages sometimes would in MacPorts.

So my decision to give up on the OS X dream and resign myself to a life of cursing at crappy graphics card drivers was, in the end, driven by zero idealism and the purest pragmatism. It was about getting shit done, and not having shit break on me every few weeks. Because when I look back over my OS career, I realize I’ve spent far more down-time un-fucking b0rked packages than I ever did trying to get Linux to talk to my wifi card. And the thing is, once I figure out the magic incantation to make the wireless NIC work, with rare exceptions it keeps working from there on out. The same cannot be said of software packages on any system I’ve used other than Debian.

I’m not going to tell anyone what they should use. By all means, use what works for you. I’m sometimes surprised, though, at the vehemence with which people will assert that my choices can’t be working out for me as well as theirs. Or at the misconception that I must use it out of some kind of holy Free Software crusade or need to be l33t-er than thou.

Does Ubuntu have more than its share of usability warts? Yep. Does its hardware support drive me to distraction from time to time? Absolutely. But overall, has it been a more reliable swiss-army chainsaw for the day-to-day hacking problems that I confront? Without a doubt.

For what it’s worth, I still peer over the white-picket fence from time to time to see how the Apple neighbors are doing. I see they’ve moved on to a third package management solution. What I see sometimes brings back memories.

EDIT: Because I’m tired of answering the same objection over and over again, a clarification. If you didn’t click, or didn’t understand, that last link: I have discussed Homebrew with a number of Mac users, and they confirm that the very same issues I had with MacPorts–missing or broken packages when you step off the “golden path”–are present in Homebrew. If Homebrew works for you, great! But understand that there are people like me who tend to go beyond the relatively tiny set of well-supported packages available in Homebrew. For me, it’s a headache I haven’t had to deal with for a long time, and one I don’t feel any pressing need to deal with again.

Oh and if I never wait for another 8-hour system update again it’ll be too soon. Precompiled packages are where it’s at.


Published by Avdi Grimm

40 Comments

  1. debian-policy certainly was a sea change in Linux maintainability. I’ve used Ubuntu on and off, but I’ve soured on it over the years. 

    Reply
    • Yeah, policy is great. Periodically I see someone switch to Arch and I go and read their pitch and I see “pristine packages” touted. And I think “no! pristine packages are the opposite of what I want! I want packages that have been brutally hammered into submission by an angry Debian maintainer!”

      Reply
  2. I haven’t had the same varied history of OSes as you (Win 3.11 -> 95 -> 98 -> XP -> 7 -> Ubuntu -> Arch Linux -> Ubuntu) but I pretty much agree with everything you say here. I’m having terrible problems with my graphics card on my shitty laptop coupled with external monitor, but I still absolutely love Ubuntu. It would take a lot to move me to a Mac (apart from anything else, it’s all so damn expensive!).

    If anyone ever asks why I use Linux, I’ll point them straight here 🙂

    Reply
  3. debian-policy certainly was a sea change in Linux maintainability. I’ve used Ubuntu on and off, but I’ve soured on it over the years. Lately, I’ve gone back to plain ‘ol Debian, particularly for my Sun boxen (Ubuntu doesn’t like non-x86 hardware), and I’m actually using Linux Mint for one of my workstations at the office, and I have to say, I’m lovin’ it. I’m using the Ubuntu-backed Mint distro w/ GNOME 3 ATM, and I like it much better than Natty Narwhal (Unity is horrid).
    At the same time, although I love Linux, I use OS X quite a bit as well… my main workstation is an iMac, my primary laptop is a Macbook Pro. Between Homebrew and RVM, most of the nasty conflicts that plagued us on Macs in the past are gone. And, even on Linux, it can take some tweaking to get the system Ruby and RVM playing nicely. So there are always tradeoffs.

    Reply
    • “Between homebrew and RVM” -> “Between a package manager made by and for Ruby hackers and a Ruby-specific version manager tool”. I love RVM, and I hear Homebrew is great, but don’t forget that you’re talking about tools that service a pretty small programming ecosystem.

      Reply
  4. I use Ubuntu for similar reasons.
    Natty is driving me up a wall ATM because of wifi problems on my netbook, but Ubuntu’s accrued enough credit with me from previous releases that I’m just waiting for the next release, (which I know will be in November.)

    Have to say that I still LOL a bit when my dad starts bitching about doing updates on his various Windows boxes.  Yeah I do updates, it usually takes about 3 minutes or less and only rarely takes a reboot.  My machine just runs, things work and I get back to work.

    Reply
    • What cracks me up is that rather than improving over the years, now every Windows program comes with its own updater daemon that lives in the systray and pops up every 15 seconds yelling at you about the updates that are available.

      Reply
  5. I accept your “Debian packages > MacPorts” and raise you a “Homebrew > Debian > MacPorts”.

    Not saying you should now ditch Linux for OS X because of Homebrew. But your experience of shit breaking every few weeks is not our experience anymore — quite the contrary.

    Reply
    • Agreed. I once ditched OS X for Ubuntu/Arch Linux because of package management, but I came back to OS X because of everything else. Homebrew may not be as good as Pacman, but it gets the job done, and that is all I need.

      Reply
    • You didn’t click through the last link, did you 😉

      I’ve been paying attention to friends’ experiences with Homebrew, and it tracks pretty closely with my MacPorts experience. Limited selection, sometimes you need to fiddle with compile flags, and if you step off the beaten path things break. Oh, and most people say “what? it never breaks!” and it turns out those people all use the same couple dozen packages and nothing else.

      Also, last I checked it wanted to stomp all over /usr/local, which is just nuts.

      Reply
      • I usually don’t click on links 🙂

        Yeah, I think I only install a few dozen formulae myself, but those work perfectly. I like how it doesn’t stomp outside of its directory, which can be anything else besides “/usr/local” (I kept it in “/opt/local” for a year).

        I like it most because I can hack on it: https://github.com/mxcl/homebrew/commits/master?author=mislav

        I can read and edit formulae. It can install software from precompiled binaries, from tarballs, from a git repo. I can add new Homebrew commands by adding ruby scripts to the PATH.

        Contrast that to Debian packages and package managers. Yeah, they’re open source, but they’re not as easily approachable and hackable. And good luck maintaining your own package and getting it included in the distro.

        Reply
          I don’t want to write install scripts myself. I just want to run them. This works on Ubuntu. My experience with the Mac so far has been poor in this regard – either it’s supported and works, or you’re likely to give up on it.

          Reply
  6. I switched to Linux (openSUSE) in 2007 to avoid the certainty of a Vista-related suicide.  I had some problems then with my graphics card (ATI.. grr) but I worked through it and fell in love with workspaces,  package management, and pretty much everything else.

    In early 2008, I bought a new laptop and had the great idea of buying one to use with Linux – a Lenovo r61 with nvidia graphics card and intel wifi – all things that were proven to work well with Linux.  The idea of having your OS choice drive your hardware choice can be annoying, but I’ve found it has solved all the issues I once had (and really, it’s still a wider hardware selection than there is for OS X).

    Reply
  7. I totally agree with you.  While I work both on Mac and Ubuntu, Ubuntu is highly preferred.  I only want to compile a dependency if absolutely necessary, and I don’t want to download a 4GB development environment to get a C/C++ compiler (recent developments have hopefully made this less frustrating).  If I’m waiting for my supporting software to download/compile/install, it’s time wasted.

    I strongly prefer Ubuntu for Java development as well.

    Reply
  8. Great phrase: reliable swiss-army chainsaw 🙂

    Reply
  9. Us Mac users still have a little chuckle when a random video card driver update causes you to lose a whole morning of productivity trying to figure out exactly what has happened. 

    Homebrew has come a long way. It is actually pretty functional these days. These days, you can pretty much choose what you want. You can be productive on Linux or Mac OS X. 

    These statements come from someone who has been using Linux for quite a bit longer than you and most people reading this post.

    Reply
    • As I tried to make clear, this post is a very personal statement. YMMV. To date, I, personally, have lost more time to broken packages than I’ve ever lost to hardware issues; so for me Debian-based Linux has been a more productive environment.

      It’s worth noting, too, that the hardware downtimes are easier to schedule. I’ve rarely had hardware support breakage as a result of an ordinary system update. They almost always happen during upgrades to the next version of the OS. So I know to schedule system upgrades on weekends, and I don’t lose work hours.

      (Incidentally, does anyone NOT do this? Most of the Mac devs I know waited until the weekend to install Lion)

      General package updates happen a lot more often–sometimes they are even driven by development needs, e.g. I hear that a new release fixes an issue I’m seeing in the app, so I do an update, and suddenly something else breaks. That’s the situation that most often killed my productivity on the job.

      Reply
    • However, Linux supports thousands of different video cards and other devices, whereas OS X only needs to handle the few that Apple sells.  That said, it would be REALLY sad if OS X choked on Apple hardware.  Linux does a pretty good job of handling most custom PC configurations from many different vendors.

      Reply
    • “random video card driver update causes you to lose a whole morning of productivity” hasn’t happened to me in years. Moreover, I haven’t had to download and install a custom video driver for the last two years – on Ubuntu.

      “Homebrew has come a long way.” IMO you don’t get it. It’s not the package manager itself that makes the difference. Even if apt were crappy compared to Homebrew (which I don’t think it is), it’s the long list of properly managed .deb packages that makes the difference. It’s around 40K packages in Ubuntu’s repos.  Homebrew provides fewer than 2K, MacPorts fewer than 10K.

      Reply
  10. A just and honest view on Linux. I like “I’m not going to tell anyone what they should use. By all means, use what works for you.”. Yeah, use what works for you, that is why I have been using Ubuntu for five years.

    Reply
  11. You are just another dude who hates Microsoft’s tech. Linux, OS X, Android, and the other OSes are just flavors that we customers can taste… I like Linux in all its presentations (some are really big crap!), UNIX in all versions, OS X in all versions… I mean, every platform has some stuff that is so cool. I’m a developer, and I love seeing the thin lines of interoperability among all those OSes. That’s very important, because all these platforms are so powerful that they give us the ability to create tools and use them anywhere we want. Now we shall learn what Cloud Computing means…

    Reply
  12. Dear Avdi, re: your “policy” remark, could you write an essay explaining in detail all the things a programmer (like me) should do to honor “policy” (and why), and how an unintroduced person can build a “package”? I’d also like to know the state of packaging systems: is apt-get the tool for everyone now? I bet others would want to read that.  – gary knott, garyknott@gmail.com
    (P.S. See http://www.civilized.com/programming.html )

    Reply
    • If  you click through his link you can read the Debian policy manual.  It’s fairly short and approachable (if a bit more Debian-tools-specific than you might want).

      Reply
  13. Interesting post. What approach do you usually take when you want a package that’s more recent than what your OS version provides? Do you compile from source, use a PPA, or use a package designed for a newer OS?

    I asked two questions on Ask Ubuntu (http://askubuntu.com/questions/55572/how-do-i-decide-which-ppa-to-use and http://askubuntu.com/questions/55821/use-a-ppa-or-use-a-package-for-a-different-ubuntu-version), but the answers weren’t awfully helpful.

    I know there’s RVM … but I want something less bleeding-edge than that for work.


    Reply
    • Ubuntu Natty is already at 1.9.2 in the ruby1.9.1 package, and will be going to 1.9.2-p290 (or newer) in October.  If you need actual bleeding edge and there happens to be a PPA, fine, but bleeding edge is really “here be dragons”.  You may want to ask yourself if you really need that.

      When you do, compiling from source temporarily is an option.  If there’s a newer OS with a newer version you may consider just upgrading (this comes up less with pure Debian than with Ubuntu)

      Reply
  14. Yup, and I’d like to add:

    1. The existence of Linux has made Windoze and Mac OS X better than they would have otherwise been.

    2. I hate the current package differences and installation/update inconsistencies amongst the distros; Ubuntu is king for now, but… “someone” in the Linux community needs to POSIX-ize package format/management in Linux.

    3. I’m a KDE guy (Gnome is too wimpy for me, albeit KDE has too many “knobs” to twist). Unfortunately, Linux lost its chance to take on more of the desktop (due to the usual problem of no one really being the Captain of the Linux “ship”); so where it goes from here is anybody’s guess/speculation/punditry.

    4. While it’s not surprising there’s so much “chaos” in “free software” (albeit the money IBM et al. put into it), at least for the most part that chaos produces a pretty reliable OS experience.

    Reply
    • I used to think that getting the formats/tools standardised on would help a lot, I’ve come to realise that Debian could run on yum/RPM and still rock, and RHEL based stuff could run on APT/deb and still suck.  It’s the policies and the tools and the dedicated community that make Debian rock.

      Reply
  15. I know this probably comes up a lot, but one thing to keep in mind with the “hardware support sucks” problems is that OS X actually has the worst hardware support of almost any OS.  It’s just that you almost never (unless you’re Hackintoshing) use it outside of an Apple-constructed hardware bubble.  We also have such bubbles available for the Debian-based world: ZaReason and System76 being some particularly good examples of OEMs that make sure the hardware works so you don’t have to futz 🙂

    Reply
  17. Avdi – I’d be very interested to know what hardware you are using Ubuntu on? Do you run it on your MacBook?

    Reply
  18. Would agree with every word, having had a similar experience and getting back to Linux every time I need something “reliable and with understandable problems and ways to debug them”.
    And we are in 2013 now, running Ubuntu on a Zenbook U32, after using a MacBook Air for almost a year.

    Reply
  19. Great writeup. I’ve had many similar experiences and thoughts. I wrote up a response: http://www.benjaminoakes.com/2013/12/04/my-thoughts-on-avdi-grimms-why-linux/

    Reply
