I'm guessing the X in the title of this post stands for "desktop", since that's what the other post on the front page today was referring to. But I think the problems in this article have nothing to do with the mainstream indifference towards Linux on the desktop. (And before I start, I should say I love Linux and have used it on my primary machine for years--and am still using it now.) I think the problems are:
1. The milestone-release system in most big distros. For Ubuntu, the biggest and supposedly most user-friendly distro, I'm expected to upgrade every 6 months. One could argue that if your system is working OK, you can stick with one release forever--imagine still using 8.04 in 2011. But what if I want Firefox 4, or a new version of a single program? On Windows, you just go to the website and install, or sometimes the program auto-updates itself. On Ubuntu, I must update the entire system, even if I want just one program updated. That means when I upgrade to get Firefox 4, there's a chance my wifi will no longer work (happened to me in 9.04), or that hibernate won't work (happened to me in 11.04), or that my desktop environment will be shockingly different for no reason. All I wanted was Firefox--but to get it, I've got to swallow Unity and any other half-baked software the distro throws at me.
My mom ranted at me for 15 minutes because I installed FF4 on her Windows machine and now her "Home" button was on the other side of the address bar and her address bar wasn't on top anymore. Can you imagine if she had been using Ubuntu, clicked "yes" to the upgrade prompt just to get it out of the way, and had been presented with Unity? She would have had a stroke.
Yes, you can add PPAs and, through various console voodoo, upgrade only certain parts of the system, but not every program has a PPA, and installing one is beyond a mere mortal's grasp.
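For the curious, the "console voodoo" in question looks roughly like this (the PPA name below is just an illustration; each program needs its own PPA, if one exists at all):

```shell
# Hypothetical example: pulling a newer Firefox from a PPA.
# The PPA name is illustrative -- not every program has one.
sudo add-apt-repository ppa:mozillateam/firefox-stable  # register the repo and import its signing key
sudo apt-get update                                     # refresh the package lists
sudo apt-get install firefox                            # install the newer build from the PPA
```

Three commands, sudo, and a repository name you have to hunt down yourself--exactly the kind of thing a mainstream user will never do.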
And even if you decide to skip a 6-month upgrade, at some point you won't have a choice--security updates will stop coming. Good luck upgrading an Ubuntu system that's 2 years of releases behind--you're going to have to flatten and reinstall, again something beyond mere mortals.
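That's because release upgrades only move one step at a time--a system that's four releases behind has to be walked through every intermediate release (a sketch; in practice one of the intermediate upgrades often breaks something):

```shell
# Ubuntu can't jump releases: going from 9.04 to 11.04 means
# upgrading through 9.10, 10.04, and 10.10 one release at a time.
sudo do-release-upgrade   # upgrades to the *next* release only
# ...reboot, hope nothing broke, then repeat for each release in between
```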
2. Quality control--and this is tied in with #1. Again going with Ubuntu (but I think this applies to most other distros as well): every time I upgrade, I'm presented with a swamp of fresh bugs and regressions in things that used to work. I've been using the same laptop since 8.04, and with each upgrade something that used to work breaks, something that was broken before gets fixed, and I get new bugs to deal with. Sometimes hibernate doesn't work; sometimes wifi; sometimes the boot splash is corrupt; sometimes this, sometimes that. I know quality control is a hard thing to do considering it's all volunteer-powered and we're fighting against proprietary lock-in; but there's just no excuse if you're trying to put Linux on the desktop.
If Linux is to beat Windows, it has to be easy to update specific software without updating every damn thing and without regressions. Windows has, more or less, managed to do this. So far Linux hasn't, for whatever reason. Until it does, it'll be relegated to being an enthusiast's OS (and there's nothing wrong with that either).
Most people (meaning the 'general population', 'mainstream users', the average Tom, Dick, or Harry) who aren't GNU/Linux users fall into one of three groups:
a) They've never heard of G/L;
b) They don't know how to install it, don't care, or figure 'Windows/Mac is good enough for me';
c) They had compatibility problems with it when they tried.
Of course, the idea of general population depends on who you interact with the most, but let's assume general means people who go to Best Buy/Costco/Walmart for computers. These people would probably be willing to try G/L if it came by default. Yet the moment some weird error came up that involved anything more than a simple Google search, back to the store the computer goes. This is not only a loss of money for sellers and manufacturers, but also ends up being really bad PR.
G/L distros like gNewSense or Trisquel (both of which run Linux-libre) would run and sell really well at Best Buy if the video cards, wifi adapters, etc. they came with worked out of the box. Yet God forbid you use some other [wifi adapter, insert other unsupported device] that doesn't have the right firmware! (Of course, this also applies to regular G/L, albeit much less so.) Most 'average Joes' don't care about things like FLOSS unless it works and doesn't require a lot of effort to set up and use.
That's not to mention running Windows programs that don't have a FLOSS alternative.
My point is this: G/L won't become immensely popular without the major compatibility issues being fixed and it becoming the default install on a whole major line of computers. Compatibility won't be fixed until the vendors of the devices or programs see significant profit in releasing the firmware/whatever and do so. Default installation won't happen until some major hardware company comes along and sells G/L only.
TL;DR: G/L needs major support to become commercially popular.
I'd really like to know what happened to Ubuntu being supported by Dell a while ago. http://www.dell.com/ubuntu only lists one machine that you can buy with Ubuntu pre-installed.
If this had kept up, and more suppliers had joined the bandwagon, it would have been exactly the major support you are talking about. Why did things move backwards?
I think this problem was one of perception. Too many salespeople were pushing netbooks as "cheaper, smaller computers that can do anything a desktop can except gaming", which to most people translated to "cheap and small Windows install that can do simple games". But then they took it home, turned it on, saw something that wasn't Windows, and got scared.
For #1, stick to a Long-Term Support release and upgrade every 2-3 years, and, if you like, install an individual software package from its own non-Canonical (ha, pun!) .deb, or a PPA, or just download a package from the publisher.
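Installing a single vendor-supplied .deb without touching the rest of the system looks something like this (the filename here is made up for illustration):

```shell
# Install one downloaded package on its own (filename is illustrative)
sudo dpkg -i firefox_4.0_i386.deb   # install just this package
sudo apt-get install -f             # then pull in any dependencies dpkg couldn't resolve
```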
For #2, a thousand times amen. My project for today is re-partitioning my hard drives to make room for a Windows installation. Wifi sharing doesn't work anymore (because someone decided that bridging wifi to ethernet adapters was a misfeature?), but it's still happily advertised in Network Manager.
Suspend and Hibernate don't work (even worse, they lock up the system), and the summer weather is too hot to keep the desktop running 24/7.
Apparently, desktop Linux only works if you version-lock yourself to 2-to-5-year-old hardware and the features designed for that hardware--hardware that took 2-5 years to port to Linux and debug--all while creeping through a minefield of regressions in stability, functionality, and user experience.
If you can't use modern software, and you can't use the capabilities of modern or bargain hardware, then what's the point? It adds up to a hidden tax: you endure an ongoing lag against progress and waste money on hardware you can't fully exploit. You may as well buy a proprietary OS and software with the money you save by buying new-generation Windows-certified hardware, and the money you earn by not spending 10 hours a week trying to fix your machine.
Fighting for freedom is noble, but it's a martyr's calling.
It's hard to be so in love with technology, while at the same time never getting to enjoy using it. It's sadomasochism.
This goes for the desktop. On the server, where we have hardware drivers with 10-year lifecycles, top-to-bottom programmability, and network transparency, Linux is a dream.
I am planning to switch to Windows to get wifi and multimedia working, and to run Linux in a virtual machine inside it, treating the VM like a local, low-latency, headless Linode/EC2-style instance.
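Under, say, VirtualBox on the Windows host, that setup would look roughly like this (the VM name and port are placeholders for whatever you configure):

```shell
# Boot an existing Linux VM with no display attached
# ("linux-dev" and port 2222 are placeholders, not real defaults)
VBoxManage startvm "linux-dev" --type headless
# Reach it over SSH via a forwarded port, as you would a remote instance
ssh -p 2222 user@localhost
```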