Despite the ranting title, I actually have a technical description of my problem, and then what went on. This is half for posterity, so when I go to upgrade again later I won’t have to curse at my computer and the Ubuntu forums – I will just be able to read my own blog.
Upgrading to 11.04
Ubuntu Linux upgrades every 6 months. A few months ago I read that 11.04 was buggy, and I shouldn’t bother upgrading until that settled out. So I waited until last month. Then I upgraded my server in the basement. My server holds media files and backups (now with super TimeMachine powers for backing up Macs), acts as my side-project web development server/code repository host, and records TV for me using MythTV.
Then I have a smaller, lower-powered computer (but one able to play video at 1080p) hooked to the TV upstairs, which streams TV from the basement server, or from Hulu or whatever (except Netflix, but don’t get me started on how stupid it is that they don’t support Linux).
So the server upgrade went flawlessly despite running on pretty outdated hardware – I built that machine in 2002, and haven’t done much to it since then.
Then another month goes by, and I decide to upgrade the front-end computer connected to the TV. This is a DISASTER.
As soon as I install the latest NVIDIA driver to be able to watch the video at 1080p and reboot, I get a blank screen. I try tons of stuff – tutorials, troubleshooting guides (this is apparently a pretty common problem), I even upgrade to the beta release of the upcoming version in October – Nothing.
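For my own future reference, the first-pass triage for this kind of blank screen looks something like the following (a sketch, not the exact steps from any one of those tutorials – log paths and module names can differ by setup):

```shell
# Rough blank-screen triage (run from a text console, e.g. Ctrl+Alt+F1):
cat /proc/cmdline                          # what parameters grub actually passed the kernel
lsmod | grep -E 'nouveau|nvidia' || true   # which video driver module got loaded (if any)
# Xorg's log is where a silent crash usually leaves its only trace:
grep -E '\(EE\)|\(WW\)' /var/log/Xorg.0.log || true
```

If the Xorg log shows nothing at all, that’s consistent with the silent hang described below.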
So after days and days of spending time on this, and not being able to watch SYTYCD or Dr. Who for a couple of weeks, I give up and re-install Ubuntu 10.10, which works flawlessly.
What I think is going on, from a technical standpoint: in the 2.6.38 and later kernels, the bootloader can pass a parameter to the kernel to set the graphics mode (i.e. color depth and resolution). That way the kernel can throw up a nice boot animation or graphic in the same resolution you will ultimately run on the desktop, which avoids the black flash that happens on older Windows and Linux when the driver finally gets loaded and switches resolutions – also called flicker-free booting. The program that does this nice hand-off is called Plymouth, and when I read about it over on http://www.phoronix.com and about all the wonderful stuff it would do, I thought it was great. But it appears to be pretty broken – especially with the proprietary NVIDIA Linux drivers, which are the drivers that do all the neat stuff I bought the NVIDIA motherboard for (i.e. decode video super awesomely with almost no CPU usage).
So, as I understand things it works like this when you boot Ubuntu:
Grub 2 loads and lets you select an operating system if you have more than one installed (I don’t), or lets you select failsafe modes or older kernels.
In the entry for the kernel / OS there is some stuff about passing mode info – there is also a default that tells grub what mode to be in during this stage (so even your bootloader can look less ugly).
When you start booting, the kernel loads your video driver and hands off to Plymouth to draw the pretty screen so it doesn’t flicker.
Then, once that is all fine and dandy, the Xorg server (which is what previously set your video mode and caused the flicker at this point in older Linux booting) gets handed control of the video driver and shows your shiny desktop. Except on my system it takes a giant crap, doesn’t leave any log messages or errors that can be deciphered, and shows you a blank screen while it hangs – silently crashed, waiting in the background doing nothing but driving you insane – like a super zombie ninja with psychic powers of doom.
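For reference, the mode hand-off in the steps above is steered by a few lines in /etc/default/grub. The values here are illustrative, not what fixed (or broke) my box – and "nomodeset" is the blunt-instrument workaround people usually suggest for proprietary-driver blank screens, since it disables kernel mode setting entirely:

```shell
# /etc/default/grub -- the knobs for the mode hand-off described above.
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"   # try dropping "splash" or adding "nomodeset" to debug
GRUB_GFXMODE=1920x1080                      # the mode grub itself draws in
GRUB_GFXPAYLOAD_LINUX=keep                  # hand that same mode on to the kernel

# After editing, regenerate the real config:
#   sudo update-grub
```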
Anyway, I digress. I spent like a million hours trying to get this to work before downgrading back to 10.10 (and then having to re-fix my NVIDIA motherboard’s lack of audio out over HDMI – could we get this kind of crap fixed in Linux please – audio and video are a giant fat mess – like 100 times worse than the switch to Windows Vista, only it works way shittier and never gets fixed, even after 5 years of running Linux – lame!). And, after all that, all I have to show for it is frustration, and a fear of upgrading.
Next time I will do two things differently:
1 – Just create a new partition, install the new release to that, and dual-boot – so I can watch Dr. Who or recorded TV on MythTV (new season of Glee! – AWESOME-sauce!) – and still try out the new stuff to see if it is broken.
2 – back up the config files I change on my server. In general I’m starting to work up a real backup plan – and one of those things needs to be getting the config files that I’ve changed or set up identified, and then backed up to a separate folder which is backed up to the cloud. I’ve got a lot of hours into configuring my server (in particular), and it is just built up over the last couple of years of screwing around and making stuff work. Does anyone know a good way to go about that? Particularly, I don’t want to back up binaries, or version-specific files – but I do want to be able to upgrade and not have my stuff break (I’m looking at you, ejabberd). And I want a good way to automate that backup (and I’m really getting annoyed w/ rsync, so unless you have a magic way to set it up easily, don’t suggest it – plus, in the world of Dropbox, why can’t we get more seamless backup solutions for Linux that monitor file system changes instead of backing up on a schedule?). And then a way to easily know where to put those files when I want to restore them on a new install or something.
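For what it’s worth, the crude version of what I’m describing – a manifest of hand-edited config files, copied into one folder that then gets synced to the cloud – could be sketched like this (the manifest idea and the function name are made up by me; it does nothing clever about versions or diffs):

```shell
#!/bin/sh
# Sketch: copy every config file listed in a manifest (one absolute path per
# line) into a destination folder, mirroring each file's original directory
# layout so a restore knows exactly where everything goes.
backup_configs() {
    manifest=$1
    dest=$2
    while IFS= read -r path; do
        [ -f "$path" ] || continue             # skip files that moved or vanished
        mkdir -p "$dest$(dirname "$path")"     # recreate /etc/..., /var/..., etc.
        cp -p "$path" "$dest$path"             # -p keeps permissions/timestamps
    done < "$manifest"
}
```

Point the destination at whatever folder your cloud tool watches, run it from cron (or an inotify watcher, if you want the Dropbox-style change-triggered behavior), and restoring is just copying the tree back over /.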
Anyway, that is rambling now – but point is – on number 2, suggestions for things that have worked well for you are much appreciated.
To recap – Ubuntu 11.04 is horribly broken on my NVIDIA system (Zotac micro-ATX motherboard w/ 9300 graphics/chipset), I need to be smarter about how I approach upgrades on critical and appliance devices (i.e. servers and single-function computers like my MythTV frontend) – suggestions about backup strategies welcome.