Fixing my Kubuntu 12.10 upgrade


So at my new job we use Kubuntu to build some of our Linux-based software – which is kinda awesome cuz I love working with *nix-based environments (Unix, Linux, BSD, Mac OS X, etc.) for a lot of really nerdy reasons. This is also cool because I use Kubuntu's big brother – Ubuntu – at home on all my Linux PCs (my DVR server and the media PC hooked to the TV).

But one of the downsides to Linux is that occasionally you get weird problems on upgrades or installs – and since I jumped the gun ahead of our gurus and updated to the recent (K)Ubuntu 12.10 release, I of course ran into one.

I had an issue where, on login, I got a bunch of error messages about my audio device and my network interfaces being missing, and then when I tried to run the update manager it would open and show updates but not let me install them. When I clicked “Install” it would just sit there, spin the busy indicator, and grey out the background.

Well, I chased this around for a while with no meaningful error messages to go off of, and eventually I stumbled upon what might have been causing it. There are some packages the window manager (the thing that draws the windows and the taskbar and what-not) needs in order to pass messages around, and at least one – probably both – of the packages I'm about to mention seemed to have gotten out of whack somehow.

So the trick to fixing it was simple – reinstall the packages for dbus and PolicyKit. So in Ubuntu or Kubuntu hop onto the command line and type:

sudo apt-get install --reinstall dbus
sudo apt-get install --reinstall policykit-1

And that seemed to fix it for me. Hopefully, if you have a similar problem, this fixes it for you without all the searching around the internet.

Ben-tenna project 2.0


On Friday my MythTV server (my DVR software running on my server) had major audio and video artifacts and skipping during Dollhouse. This isn’t the first time this has happened and is particularly annoying since the TV upstairs rarely, if ever, has issues with Fox. Well, this was the last straw for me.

I figure this was one of, or a combination of, three possible issues:
1) Antenna is not good enough. It is a good antenna that I built, but not outdoor worthy, and where I have it in the house it is bound to have multipath issues.
2) Crappy tuner card. It is a cheap card and I know it isn't the best tuner, but I didn't want to shell out for a nice Hauppauge or something similar. Anyone have any good Linux/MythTV-compatible card recommendations that are under or around $80?
3) Hardware / hard drive issues. My server is an Athlon XP 1800 on an Asus A7V8X with a Seagate 300GB SATA drive, with the recording partition formatted XFS. I'm a little worried that there are some I/O issues getting the data off the PCI card and onto the hard drive. Ideally at some point I would get a PCI Express tuner card in some dual-core machine (hopefully lower power when idling too – I want to green this setup a bit, along with my P4 3.0 GHz frontend PC).


So, I decided that tackling the antenna would give me the most bang for the buck long term (especially if I can get it permanently mounted on the roof or outside).

So, I was debating between a dual-bay version of the one I made previously (based on a Channel Master CM4221), or going with the newer, GPL-licensed Gray-Hoverman. I did a little homework and decided on the Single Bay Gray-Hoverman despite it being slightly more complex to build well compared to the dual-bay Channel Master style. Then I did a little internet poking and found this site, which documents a guy building his out of PVC.

I didn’t deviate too far from the original design plans for the active elements of the Gray-Hoverman, and copied his frame (only with a single-bay version) almost exactly.

It cost probably close to $50 by the time I bought a pipe cutter and a hacksaw (how did I not have one of these before????), so the PVC, screws, and washers probably came out to more like $20-$30. The copper wire wasn't cheap either, but I bought it a long time ago in anticipation of this project and don't remember how much it was, and the chicken wire was lying around from our “de-mousing the vents” project a year ago.

A gallery of some of the work-in-progress and completed builds should be attached below. I have only tested it a little and it gets OK reception sitting on my couch. If I put it outside on the roof (that sounds like a PITA though, and I'm lazy) I bet it will do better than any of the old ones I built, and it will stand up to weather better (it just needs some outdoor PVC paint first).

I'll put more details about the performance on local channels up after I get time to compare it to my older ones built out of cardboard, wood, and coat-hangers. I'm expecting good things, but those old ones didn't suck. What will suck is if the tuner on this HDTV card really is that bad, and this doesn't help very much. (For reference, I have a KWorld ATSC-115.)

And finally, before the pics, some links for more interesting info:
Two articles about building the antenna from some RF magazine
http://www.tvtechnology.com/article/68820
http://www.tvtechnology.com/article/68436
And the PDF for the GPLv3 version of the antenna with the rod reflectors/passive elements.
http://www.xmtr.com/articles/RF182-fig1.pdf

Paradigm Shifts in how we Use Data and Voice


You may have heard some of these questions (some from me) or thought of them yourself:

  • Why does only the iPhone have visual voicemail?
  • If I can make calls for free using skype from one computer to another why can’t my phone automatically know if I’m calling a computer and switch to that when wifi is available? (if the phone has wifi built into it)
  • Why is instant messaging free, but texting costs money when they are essentially the same thing except on a computer instead of a phone?
  • Why don’t my facebook notes comments sync automatically or easily to my professional/non-facebook blogs comments and vice-versa?
  • Google Voice seems awesome, but why doesn't it integrate better into my phone, and why hasn't my carrier done something like that before (or at least integrated it into their phones)?
  • What is going to replace email? It was invented in the '70s and hasn't changed much since then. Why haven't we thought up better ways to communicate yet?


Maybe I’m weird, but I actually think about these questions a lot. These questions plague me more and more as Google is routinely blowing my mind with awesome products like Google Voice, and Google Wave (especially Wave).

It seems to me that for a while we have been on the cusp of a paradigm shift in how we communicate. I know there have been lots of major shifts and new shiny things out that expand how we communicate or consume information, and for some that has been paradigm shifting. But for some people like me, it feels as if there is a lack of cohesion and depth to these products.

For example:
I email and chat with people using gmail (Wave is going to combine that with document sharing, and some other neat stuff, go check out the video on their page if you want to see a demo)
I leave public status messages / funny or quippy site links on twitter, which get sucked into facebook. Comments on facebook don't get posted to twitter though, and replies on twitter don't turn into comments.
I text and call people with my phone. Soon Google might transcribe that for me automatically.
I can IM people on my phone now, and even get messages while I'm not in that program (yay push notifications – I envy Palm Pre users for their native ability to run an actual IM app in the background). But I can't interleave that with my texting unless I'm on a Pre.
I post blog posts on my blog which get sucked up into facebook and livejournal (which is not as straight-forward as it should be to set up IMO). But, no comments or replies sync between sites.
All of these different sites and devices have different logins and different ways of handling what is basically a flow of information to and from me, from and to other people.

Why don’t we have a way of integrating these together?

There are lots of reasons why, and I have lots of opinions on the subject, but rather than bitching or pondering how the system has failed I’m interested in possibilities.

So, here is what I’d be interested in:

– Something akin to Google's new Wave product as a base, meaning an open protocol designed to track flows of conversations and discussions. Only bring in Google Voice so it has my call logs and texting history right in the flow of conversations.
– Voice calls have to fundamentally shift from being perceived as ‘audio’ to being perceived as data and an important part of the flow of the conversation. That means a call should be logged into the flow of a conversation. (What conversation? I will get to that in a second.)
– The ability to record or transcribe calls would need to become a common addition (and probably optional, all conversations could go “off record”). So in a conversation it would either mention that a call was placed, or it would have a recording or a transcription embedded into the overall flow. Same for video chats.
– Open authentication – not facebook connect style, but more OpenID style. And the ability to associate your phone number(s) and email addresses, and tie them across servers, would be paramount. This means if you have an account at livejournal, gmail/google, and facebook, you could tell them to synchronize your credentials. This would then have facebook check your listed number against Google's, and if Google's is different, both places would add the different numbers as secondary phones to your account. And then livejournal would be able to access that and mention that you have a blog in your ‘sites’ profile. Also you could use any address (facebook user name, google email address, livejournal username) to log into any of the services. This is basically an expansion of OpenID in my mind.
– Complete control over permissions and accessibility of data. Privacy is important and I think facebook gets this mostly right. You need to be able to say “Only people I authorize can see detail X, and unless I give them special permission they can’t see Y, but everyone can see detail Z.”
– With permission, people could add you as a contact and then see whatever information you make available to them, like facebook, only synced across multiple sites (as a few points above mention). So if I add friend “Bob ABCD” to my phone, and Bob approves me as a friend, my phone would automatically have whatever data he is willing to share with me and has on his profile. (Or alternatively, I could augment it with manually entered data that is stored only in my contacts, not pushed to the profile he shares.)
– Contacts already approved and stored in your contact database and associated permissions for contacts need to be non-centrally stored. Meaning if you get a new phone, or decide to switch from yahoo to gmail cuz you like it better, you can just log in and tell it to rebuild your contacts. This information should be stored in a fashion similar to your login credentials, and people could only see your contacts/friends or a subset of them if you give them permission.
– The client / server model would have to be de-centralized. Multiple servers across multiple domains need to be the primary place you go for tracking this information. So if you prefer Yahoo's interface, no problem, just use Yahoo. Or if you prefer facebook to manage this, plus all the silly quizzes and memes, use facebook. But it all syncs across boundaries. Meaning your cell phone provider is still the gateway to your cell phone, but not the only keeper of all your history and contact information (and neither is your handset/cell phone).
– Micro blogging / status updates and follow-up comments need to be seen as a conversation, and should span across boundaries of services.
– Blog posts could be the beginning or continuation of a ‘wave’ or thread of a conversation, and comments would be carried across site boundaries.
– A push notification framework would be required, but it would have to have the ability to filter so you only get notifications of things you want (texts/IM, phone calls/audio chat/video chat, emails). Drinking from the firehose is fun for about 2 seconds.
– Hardware agnostic, meaning that the connection you use would be data-based. Not audio based like a cell phone network, which then has a data network laid over it (like texting is).
– Conversations need to be able to take different forms. We need the ability to tag by topic and sort by that, but also by the person being communicated with, to produce a chat/IM/text-like history with emails and calls in there as well. But we also want to be able to start off a topic of conversation and just follow where that goes, like how gmail threads email chains (again with everything else put in it). A conversation/wave/whatever you want to call it could also be filtered by type – microblog (twitter/facebook updates), blog, email, chat. The advantage here is managing it from one location.
– A good standardized codec for voice and video chatting. Something that takes into account the connection someone is on, and device they are using. So on a cell-phone data network you could stream lower quality video, prefer good voice, and use a smaller resolution. But on a PC with a high-speed cable connection it would give you higher quality everything with a bigger resolution.
*Neat idea as an add-on*
– Incoming package notification. Someone shipped you a package, or you are buying something from amazon. It could find that out via you associating address info with your login credentials, and a discovery API for connecting to the UPS/fedex websites.

I do believe we are headed to a place like this with our communication.

And I think much smarter people than me have already been working on ideas like this for a few years (if Google's continued desire to buy up innovative companies and create game-changing products is any indicator). And, with Apple's obvious lack of love for AT&T, which keeps trying to pigeonhole the iPhone (no data tethering!?!??!), I believe we could see a shift to handset makers building phone/data devices around such a protocol if there were one.

So, does anyone else have a suggestion about something like this, or a way to make it more interesting or useful? Or do you look at my suggestions and wonder what the hell I’m talking about and just wish I would only send this to nerds so it didn’t show up on your facebook/livejournal page? 🙂

Being Green is about more than being Green


Lauren and I have been trying to do better at recycling (unfortunately Minneapolis makes you sort it, so I forget a lot) and we have probably cut our actual trash in half or better in the last 6 months.

Also, I’ve been doing a lot of work over the last year for the solar market (CIGS solar to be exact), and I’ve been attending a lot of talks on Green buildings (particularly churches) at the conferences I’ve attended and I’ve had an interesting realization.

Being Green – particularly the principle of understanding where the things you consume come from, where the waste and other end products of consuming those things go, and who that affects – can be really relevant to other areas of life.

I’m sure that is obvious to a lot of people, but one particular application that this realization has sparked in my mind is acquiring and making general decisions about technology.

To complete this exercise I've come up with some categories and related sub-categories that have become important for me to think through before making a recommendation or decision.

Consumed by me/church/organization:

  • People time – How many hours is this going to be used? How many hours is it going to take in training / setup before this becomes useful (the wife test – phase 1)? How many people besides me will actually find this useful (i.e. the wife test – phase 2)?
  • Energy – Is this electronic device going to be an appliance (i.e. always on)? How much power does it take to run this thing?
  • Footprint – What is this made of? Where did that come from and how far did it have to travel to get here?
  • Monetary – What is the up-front cost? What are the costs associated with basic operation (relates at least somewhat to Energy)?

Produced / Long-Term, short-reaching effects:

  • Footprint (again) – When it is finished being used is it something we can give away that will be useful for someone else? Can we recycle it or any parts of it? What is the landfill impact on the parts that can not be recycled?
  • People Time – This is the time that people spend finding bugs, working around them, posting on forums, etc. This should not be underestimated, as a lot of the power of great software and hardware comes from real-world usability studies (i.e. people using it and sharing their thoughts on it).
  • Lock-in potential – Where are we locking into something that limits us later? How much will it cost to migrate to our next level / next growth phase? Is that avoidable? How much will it cost to move to the next place?

The long-term, long-reaching effects – AKA the Ethanol Effect:

  • The money chain – your money goes somewhere, which then goes somewhere else, and so on and so on. How far up that chain can I see? Is that money going somewhere I’m not happy about? Is it failing to go somewhere that I want it to go?
  • Standards vs. Proprietary – People and the church can really help influence trends. Does the thing we are interested in promote inclusion or exclusion? In other words, does it help innovation or hinder? (This could be technological innovation, or business innovation in the global market, or any type of ‘innovation’ or creative thinking that helps us keep moving forward in justice for more people and making the economic pie bigger for the world). Are we helping monopolistic lock-ins and proprietary technologies, or are we promoting technologies that try to make that as simple as possible? Are we creating a market that locks out other countries and people so that only wealthy countries / people can get in on the benefits of local innovation?
  • Lost Opportunities for ourselves and others – Are we promoting something with consequences that can impact the market and cause horrible things for other people (i.e. ethanol = re-grow-able gas = food in our gas tanks = increased global food costs…there are other examples, but that is the most obvious right now)? If we sacrificed a little, would it create a greater opportunity for justice in other places (other communities, or countries)?

I want to save some more in-depth discussion of the individual points above for separate posts (lots to unpack), but I want to highlight the part that I call ‘The Ethanol Effect.’

That part is really hard, fairly abstract, and I believe something we should try to tackle from a biblical standpoint. This thought is an abstraction of the passage in Deuteronomy that says:

When you are harvesting in your field and you overlook a sheaf, do not go back to get it. Leave it for the alien, the fatherless and the widow, so that the LORD your God may bless you in all the work of your hands.

Basically meaning that sometimes we have to make a few sacrifices and not always squeeze the last possible immediate gain so that we can bless the most oppressed and forgotten amongst us. Then, we subsequently receive more long-term blessing from God.

In a global economy I believe that the aliens, the orphans, and the widows of biblical times equate to the modern social outcasts in our local communities, the global south, and the people suffering under oppressive, tyrannical governments.

And since we know that as individuals we are part of the trends in spending, and subsequently global economics, we should as Christians try our best to consider the far-reaching implications of our decisions.

That is one of the primary reasons I support more open-source and particularly standards-based technologies…but again, that is a topic to be expanded upon in a different post.

So, let me know what you think. Am I over-thinking here, or is there a way you think that I, or we as global citizens, could be more ‘green-minded’ in areas that aren't necessarily considered green?

Standards Matter


To most people computers are about accomplishing a task. Getting from A to B as simply as they can, so to speak. Almost no one reading this really cares much about what file format you are saving that word document in as long as the person you are sending it to can read it. Likewise, you probably don’t care how these web pages get from the server to your computer, you just care that it works.

Here are some other related scenarios where you only care that it “just works” that I think we take for granted. Fitting pipes together: if you do any plumbing, you care that when you buy a 1/4″ tube and a matching elbow, the threads on both match up. Gasoline octane ratings: it is good to know that the gas you put in your car will work no matter what car you have. Batteries: it is nice to be sure that your electronics will work with either Energizer or Duracell as long as they are AA. It is also nice to know that your wall sockets are 110VAC at 60Hz no matter what house or apartment you are in.

The point here is that everything except the Word document scenario works because there are agreed-upon standards that these things are based on. There are a couple of “standards groups” out there (which sometimes ends up being legislation) that certify a written specification and declare it a standard for many different things, and then corporations and businesses agree to make their products meet that standard. This increases competition within that market segment, but it also increases overall revenue, because we as customers know that we are getting something that will do what it is advertised to do (you can see how this is especially important with gasoline), so we aren't afraid to purchase these products for fear of not being able to replace batteries or fix plumbing.

Which is why creating standards is an important step in the process of maturation of any market segment, but it is not a step to be taken lightly.

Today, we are all trading things around the web, and to some degree we have let the market sort things out for us. But in some ways the market has decided that public standards would be best for all parties involved as a sector was growing up. VGA and DVI monitor connectors, MPEG-1/2/4 for downloading videos and watching DVDs, TCP/IP for communication over networks, and IEEE 802.11 for wireless LANs so you can surf the web on your couch are just a few small examples.

These are standards that take the best approach at forward thinking that they can, and as the example of MPEG shows us, they are also willing to adapt to the changing technology around them. With these standards to support and ease development efforts, new innovative technology pops up around them in the form of software and hardware. (The video iPod, as well as the Archos jukebox, plays forms of the MPEG-4 based standardized video, for example.)

Where there is not a standard there is confusion. Some examples include phone chargers and data cables for phones. If phone companies standardized their phone chargers so that Nokia's worked with Motorola's, which worked with LG's, which worked with etc., etc., don't you think it would be a better situation for everyone? We could all charge our phones at each other's houses and in each other's cars, and our old stuff would always work with our new stuff. Chargers would be cheaper and probably come with neat little features like a retractable cord, or a better power light to indicate the phone is charging…the possibilities are endless. But because the market is vendor-locked, it stagnates and prices are fixed to ensure the right margins are achieved when you buy a new phone.

But the major area I believe is in need of innovation is document file formats. Mostly the average person doesn't care, but a guy like me looks at Word 95, 97, 2000, XP, and 2003 and doesn't see much improvement between them, yet does notice that Word 97 can't open a Word 2003 file. I notice this because I help people fix their computer problems, and some people (cough*Dad's Old Church*cough) are still using Word 97.

So, a guy like me, who is trying to fix someone's computer, or help someone open a file that a different person sent them in Word 97, is extremely frustrated by something like this, because it doesn't “just work.”

A standardized, open to the public, way of saving a document of formatted text and layout would cure this. Something that was flexible enough to grow and be extended as the market demanded, but explained and detailed well enough that a new start-up or open source software project could implement it into its own software.

This is possible, and that is exactly what a bunch of companies set out to do, because they felt that people who can't afford to pony up the cash for MS Office (students, for example) shouldn't be kept from opening and editing files from other people (such as teachers). This should be especially true for the government, where everyone should be able to see what they are creating. And a standard means that there will always be a way to open those files, because in the future, after the software has died, there is a public record of the standard, so it could always be re-implemented.

PDF is a standard that was opened up by Adobe a while ago. Their product still remains the best high-end PDF creator and reader. But the competitive marketspace for products in the $50 to $200 range has become much more interesting, and the small businesses, EDUs, and non-profits who used to pirate Adobe Acrobat to meet their needs now have affordable (and free open-source) alternatives. Because of this, many progressive states have put PDF on the approved list of formats to distribute forms in, while Word is not.

Recently MS realized the jeopardy they are in, especially in government situations, after a head IT guy for Massachusetts declared that MS was out because it isolated poorer people, libraries, and EDUs from opening state documents. There was a big fight, MS threw money around, and that guy got fired. But it scared them, since the OASIS alliance's ODF (Open Document Format) is already an approved international standard (ISO), which means it is acceptable under Massachusetts' proposed requirements.

So, Microsoft went so far as to try to get their Office 2007 file format standardized with a separate standards body. Most of the technical community welcomed this move cautiously, in the sense that Microsoft would no longer be the only ones who could open their Office files without a lot of reverse engineering. But many said it was a poor standard.

Here is Jeremy Allison's take on this subject. He is one of the lead designers / software engineers who wrote the code for Linux to be able to talk to and join Windows file-sharing networks and Windows NT domains. And Google agrees with his opinion.

But the thing I like about this is that Microsoft was finally forced to put their standard out into the public, so now it is up for debate. The community at large (or representatives from all the countries in the standards body) gets to decide whether it is good enough to be a standard, and whether it is worth having two standards out there competing for marketshare. No longer is it determined by Microsoft's monopolistic manipulation of the market and the consumer. No longer will we be forced to upgrade every couple of years just to share files with each other. If Microsoft wants to change their file format in Word 2010, then they have to have it approved by the standards body all over again as a revision to the standard (that is, if their standard even gets approved in the first place).

More eyes on things like this only makes them better, gives them more competition, and benefits us as consumers. And in my opinion, the more things we get out in the open for other people to review and criticize (and use, if you are into open-source like I am), the better.

A good collection of software


This is a CD filled with some high quality free software.

http://softwarefor.org/

I can speak for the Windows one in saying that I use almost half of those programs on a daily basis, have about 2/3 of them installed on all of my Windows machines, and have most of the cross-platform ones installed on Linux as well. Also, several others that I've been reading about on their website that I'd never heard of appear to be hidden gems that look very useful and very well done.

Check the Questions and Answers section to see the list of software packages on the CD.

TheOpenCD is also a good project with a similar aim in mind, but I prefer some of the Starving Students CD software in comparable categories (i.e. I prefer uTorrent instead of Azureus for my BitTorrent needs).

Beyond the Catlin MythTV Project


Now that MythTV works (but alas, it still doesn’t output to my TV, and the HDTV signal is choppy as hell on the P3 1GHz machine), I am thinking beyond it to move my home to the next level.

For a while now I’ve talked about home automation, integrated with the computer network, available from the TV or computer. Something I thought MythTV could be extended to do.

Well, some other people have decided not to extend MythTV, but to build an over-arching program that controls MythTV, the Asterisk PBX, some home automation stuff, and a bunch of other neat stuff.

It is called Linux MCE
(More details in this great video discussed here if you are interested)

I’m slowly building parts to work towards this, and here are the details on how.

Did you know Newegg has wishlists now….how exciting for nerds like me.

Step 1) Get parts for turning my current Athlon XP 1800+ machine into a “Core” system for LinuxMCE. The core system does not have to be super beefy if it is not going to run the front-end / video decode. I plan on taking advantage of the parts from Step 2, and MythTV's capability to off-load tasks to other computers for re-encoding video and doing simultaneous recordings.

Parts List for Step 1

Step 2) Get new desktop. I still haven’t decided if this is going to be an Intel or AMD based system. So I’ve spec’d both (I’m still waffling on a couple of things on there, and as always lists like this are subject to change relative to the latest and greatest / price cuts that happen on the market)

Intel Desktop Parts List
AMD Desktop Parts List

Step 3) Upgrade home theater stuff. HDTV, receiver, and maybe some new speakers. I don't like any of the Blu-ray / HD-DVD stuff yet cuz it is still a stupid format war that I will automatically lose if I bother to participate. So, I ignore it until the problem goes away and prices drop.

For this I'm planning on a 32″ to 47″ LCD (preferably LED back-lit), 1080i or 1080p, from Samsung (resolution depends on the prices when I finally have some money for this).

The receiver will probably be a Denon, cuz I like them. Unless something higher-end drops into the realm of affordability.

All these components need to have RS232 or the ability to be controlled over HDMI, because I want the controller to be able to turn things on and off for me with a single remote press (watch the video for LinuxMCE linked above; this feature is pretty sweet).

Step 4) Build Home Theater PC. Focus here is on ability to do two streams of HDTV (I like PIP) but have a very quiet hard drive and fans. Style and ability to go into the stack of media components around the home entertainment center is also important.

Incomplete List of Parts from Newegg. I need to add a quiet small hard drive (or perhaps a Compact Flash adapter instead), a TV tuner card, and a case.
The case will most likely be the SilverStone Milo ML02.
(Review here)

Step 5) Re-do antennas (or add to my DIY antenna), cable wiring, and network wiring, and buy a distribution amplifier.

Step 6) Rebuild the old P3 1GHz with a quiet hard drive and cooling fan for the SD TV upstairs / DVD player / security monitor. (No monitor, just a TV output.)

Step 7) Plan home security / automation. I'll start with the garage door. It keeps getting left open by all three of us (well, at least Elle and I, since Lauren doesn't drive right now), and I was thinking it would be great to have a switch in the house, along with an “Open” indicator. Then I'll add some IP-based webcams, and maybe some dimmer switches (can you dim those new fluorescent bulbs?).

Step 8) Sell house, and start over from scratch 🙂

And there you have it, my next really big multi-year project. I’m just glad I know I can make the mythTV part work going into it.

A good article on migrating to Ubuntu Linux


Over at Tom’s Guide Daily they are doing an article on switching to Ubuntu.

Link to article

While this is nothing new these days, I was particularly enjoying it even on the first page because of this paragraph.

Have you ever been some place you really didn’t want to be? I mean, have you ever really, really had a desire to leave but for some reason you just couldn’t? There was always something holding you back. Maybe it was circumstance, the comfort of your surroundings, or it was just too familiar (even though you know you should’ve moved on long ago)? Sound familiar?
Conditions like these keep many people tied to Windows. Those users feel there has to be an alternate way, but are unsure how to proceed. Well, there are alternate solutions to Microsoft Windows. Many are robust and allow users to make the migration with little knowledge and no loss. For anyone interested in finding the route out of Redmond, WA, please continue to read on. Today, we’re looking at the new face of Linux: Ubuntu.

Check it out if you have that same feeling. I know I had that feeling last summer when I made the switch, but it subsided after I got the feel of things.

MythTV project – Update 3


The new antenna is completely built and working. I had to buy nothing; I had everything in the house that I needed to complete the design from the lumen labs forums that I linked to earlier.

The Pentium 3 at 1GHz, even with the GeForce 6200, isn't exactly down with decoding the HDTV stream and displaying it well on the LCD monitor. I'm wondering if it will do better when it isn't downsampling (the LCD is at 1024 x 768, and 720p is 1280 x 720). I have XvMC set up correctly as best as I can tell from the guides, and it looked a little better after it was enabled, but it still isn't holding up all that great.

Also, I'm wondering if upping the RAM from 384 MB to 512 MB would help.

But anyway, the cool part is that it is working, and it is doing what it should. I think next for antenna purposes I will either build a better version of the antenna I built and put it in the attic or on the roof, or just keep moving it around inside the house until I get a really good spot.

So, until we get the new receiver or the HDTV this project is pretty much done.

MythTV project – Update 2


I received the KWorld 115 tuner on Tuesday. I tried to use this guide on MythTV's wiki, but that alone didn't give me everything I needed. Instead, if you follow my path and buy this card and run Ubuntu, start here and follow the first 4 steps (ignore the modprobe step) for the ATI HDTV card.

So I had to reboot, since I'm an idiot at getting Linux to reload kernel modules properly. This then yielded lots of weird dmesg errors about readbyte and i2c or something when trying to load the nxt200x firmware.
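(For future reference, reloading the driver without a full reboot should go roughly like this – a sketch I haven't verified, and the module names are a guess based on the nxt200x firmware message pointing at the saa7134 driver:)

sudo modprobe -r saa7134_dvb   # unload the DVB half of the driver (module name assumed)
sudo modprobe -r saa7134       # unload the base driver
sudo modprobe saa7134          # load it back up
sudo modprobe saa7134_dvb      # and the DVB piece, which should pull the nxt200x frontend in with it
dmesg | tail                   # check whether the firmware loaded cleanly this time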

But the analog signal part of the card worked…so it was at least part way there.

I somewhat figured out that it was a permissions issue. So I went to where the card's device nodes are located:
/dev/dvb (mine was adapter0)

and I used the ‘sudo chmod -R 777 adapter0’ command
and then rebooted.

This seemed to fix this error.

Now, I’m having trouble with signal strength, and getting it to lock on any one channel.

A while ago Barney pointed me to AntennaWeb which is a great start for getting antenna orientations down. But, today I found TV Fool, which is much better because it actually lists the real channels for the UHF signals that local stations use (instead of the 2_1 fake names). And it gives much more detail, which I need. See the image it created for me with all this information here. It is pretty freaking detailed….Electrical Engineers like these kinds of details.

Right now, I’m trying to play with antennas. This weekend I’ll probably build my own using these plans or some variation.

And I'm trying to get a worked-up channel list in the format that some video players use, called “channels.conf”.

I have yet to get the scripts that are supposed to make it for me to work. There is one called make_atsc_chanconf.pl that supposedly will do it, but for some reason I'm getting a “stationXML” variable something-something error that I didn't have time to look into deeply this morning.
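(Another route I want to try is the ‘scan’ utility from the dvb-utils/dvb-apps package, which can build a channels.conf straight off the air from an ATSC initial-tuning file – a rough sketch, since the exact path to the tuning tables varies by distro and I haven't confirmed it on my install:)

# tuning-file path is an assumption – check where your dvb-utils package puts its scan tables
scan /usr/share/doc/dvb-utils/examples/scan/atsc/us-ATSC-center-frequencies-8VSB > ~/channels.conf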

My plan is to get this channels.conf file all set up for the local stations, and then try to turn on the TV software and use the visual cues, instead of the “lock / not locked” cues, to aim the antenna. No one seems to have a good signal strength program that I can set to a single frequency and then just leave there until I get the antenna aimed. And since I don't have the antenna aimed, I'm getting no signals on any HD channels.
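(One thing worth trying here: the same dvb-apps package apparently has small tools for exactly this kind of antenna aiming – a hedged sketch, since I haven't run them against this card yet. ‘azap’ tunes one channel out of ~/.azap/channels.conf and keeps printing signal/SNR/lock status, and ‘femon’ just watches the frontend:)

azap -r KARE-DT   # channel name is just an example entry from a channels.conf
femon -a 0        # watch signal strength / lock on adapter 0 while moving the antenna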

My hope is that this is an antenna problem, and not a hardware problem, but we shall see, and I will keep everyone updated.

The MythTV Project Plan

No Gravatar

I’ve been rambling a lot lately about MythTV, which is an open source project that aims for something akin to Microsoft’s Media Center application for Windows XP (MCE edition) and for Windows Vista Home Premium and Ultimate.

But it is a lot more than that, with an amazingly flexible client / server architecture.

So, I’ve finally amassed enough computer hardware that is old enough I don’t want to give it away, but new enough so that it is not useless, such that I can build a “front-end” or TiVo like computer appliance for my living room.

This journal entry is to lay out what I have in terms of hardware and software (with links for my own reference, as well as to share with you), what things are already done, and what things I need to do. Also, I want to lay out some longer term plans to refer back to, so I can avoid confusion in my own head later.

So, without further babbling, here is what I have.

My personal desktop – This will act as a dual-purpose computer, being my personal desktop as well as the Myth backend. This will do all the encoding and scheduling, and hold all of the files. Long term, it would be great to move this to a dedicated server functioning as an internet gateway, LinuxMCE core, and NAS…but that is a minimum of $500 away and very low priority on the budget.

Desktop hardware:
Asus A7V8X Deluxe Motherboard
768 MB of DDR 333 RAM
Athlon XP 1800 (anyone have a Barton core 2500-2800+ w/ 333 MHz FSB that they would sell or give me?)
eVGA GeForce 6600 GT
KWorld 115 HDTV ATSC/QAM/NTSC tuner card (Newegg has it for $47 shipped after rebate)
Plextor 40X CD Burner (I should finally spring for the DVD burner at some point)
300GB Seagate Hard Drive
and I’m running my 17″ and 19″ NEC CRT monitors that either Dirk or I have owned forever.

Internet Terminal / MythTV frontend – This is my brother's old Dell PC.
1GHz Pentium 3
384 MB of PC133 SDRAM
Cheap crappy 8 GB hard drive
Sound Blaster Sound Card
eVGA GeForce 6200 (recently purchased from ebay for $26 shipped)
Lite-On 16X DVD ROM
IR keyboard (the old one I’ve had forever)
15″ LCD monitor that I may eventually add a touchscreen to, since I have some touch-glass lying around….don't ask how I got it.

Then we have the regular complement of old AV stuff: an old Pioneer stereo receiver given to me by a friend, a DVD player that plays DivX files, a CD player, a VCR, a 27″ SD TV, the Wii, and my new speakers.

Eventually I want to move to an HDTV. And before too long I have to find a decent, but cheap (read ‘used, and maybe even something I have to repair’), receiver that does S-Video switching to simplify the issue with the TV output on the computer.


Desktop / Backend

OS: Ubuntu Linux 7.04: Feisty Fawn
All Restricted Codecs installed
Nvidia Binary Driver (pkg nvidia-glx-new from the repository)
Beryl 0.20 rc3 from Ubuntu repositories (I need to switch to the beryl repository instead)
VLC
Xine
MythTV-backend 0.20
MythWeb backend for management of recording schedules over the interweb
Open SSH server
All your normal complement of base Ubuntu software

Terminal / Frontend

All the same stuff except no MythTV-backend, and instead:
MythTV Frontend 0.20
All of the Myth-plugins package from ubuntu
and All official Myth-themes.


Currently the backend for MythTV and the frontend on the respective machines are set up and working pretty well. There is a little oddness with the back-end not coming up correctly all the time, or the front-end not being able to find it. Usually if I restart the back-end software (or perhaps start it for the first time) and then the front-end software, it corrects itself.

The back-end drive maps to a local directory on the front-end for all the MP3s and movies and comics and such via an NFS network share.
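(For reference, the fstab entry on the front-end for that NFS mount looks roughly like this – the hostname and paths are placeholders, not my actual layout:)

# /etc/fstab on the front-end – mount the backend's storage directory over NFS
backend:/media/storage   /media/storage   nfs   defaults   0   0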

Samba is set up to share everything to everywhere else in the house so WinblowsXP doesn't feel left out, and this keeps my wife relatively happy. (Long term, I need to get a good rsync script set up to back up her “My Documents” folder to the big drive on the main desktop – a rough sketch of what I have in mind is below.)
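(A minimal sketch of that backup script – the share mount point and destination folder here are placeholders, not the real paths:)

#!/bin/sh
# back up the "My Documents" folder from the mounted Windows share to the big drive
rsync -av --delete "/mnt/laurens-pc/My Documents/" "/media/bigdrive/backups/lauren-my-documents/"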


Things I still need to do:

  • Get the new video card installed for the front-end (should be a simple driver reconfigure).
  • Get XvMC working on both machines…I thought it just worked through the driver, but I need to re-verify this, and perhaps set them both up to do better with my video cards. (This would explain a few things on my desktop if I wasn't using the correct settings for the video hardware acceleration for decoding MPEG2 and MPEG4.) A rough sketch of the nvidia setup is after this list.
  • Get the KWorld TV card working with the backend. This is a little more complex. (Link)
  • Hook the computer output to the TV, and make it work right. Lauren is the key here; if she can work it without having me around, then I know it is set up correctly.
  • Make sure all the Myth plugins are set up and work correctly.
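(The XvMC piece, as best I can tell from the guides – a sketch for the nvidia binary driver that I still need to verify against my driver version; after this, XvMC gets selected in MythTV's playback settings:)

# point X at nvidia's XvMC library (library name taken from the MythTV guides)
echo "libXvMCNVIDIA_dynamic.so.1" | sudo tee /etc/X11/XvMCConfig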


Longer term plans:

First off, I must find a good outdoor antenna to avoid problems with recording while I'm not home and the signal is bad. Second, I must find a reasonably priced distribution amplifier. I'm thinking a 2 x 8 should cover every possible scenario – something that will let me configure which output goes with either of the two inputs, and I would like it to amplify the signal such that each output has unity gain with the input, or possibly even around +5 dB of gain, up through 2 GHz. Third, centralize networking and media to a single box in the basement. I'm thinking of ripping out all the phone cords and using them to pull through network cable, or perhaps using our defunct chimney that goes to the basement to pull cables through. I need to research the wisdom of this one before drilling into the chimney. (Caleb, didn't you do something like that for your pipes?)

I should figure out if I can actually modify that LCD screen I got from my old job to be a touchscreen. I have the touchglass, and all I would need is to see if I can cut down the standoffs inside holding the LCD panel to allow room for the glass. I would also need a controller for the glass, but that isn’t a big deal.

After we finally get an HDTV in the year 2056, and maybe a better receiver, I plan on moving some of this hardware upstairs to the bedroom, extending the whole server/client thing to actually make more sense. So the infrastructure to do so would be handy to have beforehand.

After the new HDTV and receiver, I plan on upgrading my desktop and moving my old desktop to the role of dedicated server / NAS / backend for LinuxMCE, which incorporates MythTV for the TV time-shifting / recording functionality. This is when I buy this cool media center case from Silverstone, which fits my needs for a silent, small media center case with an LCD display, so it works and looks like part of an AV rack. This one will have some decent specs with good onboard video for watching videos (hopefully full HD content in 1080p by then), and will have a TV tuner in it.

This is the point where, if I'm still in this house, I either need to get handy with a fish-tape or tear it down and rewire the replacement myself. If this fun multi-year project goes this far, I would start ripping out the AC wiring to all the light switches and bringing newer and better wiring through, along with a planned control network (+24VDC signal wires, AC wiring, analog signal wires in the form of shielded twisted pairs, and maybe ethernet, depending on the cost of programmable light switches). I would then terminate everything for lighting, heating, cooling, and anything else I could think of back to a PLC (programmable logic controller), and use an open, standard protocol to patch the PLC into LinuxMCE (probably the industrial EtherNet/IP standard, or Modbus TCP/IP). Then, outfit each light fixture and light switch with a programmable light controller for automatic dimming, and replace the thermostat with a dumb touchscreen which would allow lighting and temperature control. The relays into the furnace and air conditioning would be replaced by the PLC (I need to research whether relay-output PID control is appropriate here…I'm guessing not). Also, in the ventilation and on the windows I would mount stepper motors, to allow for zone-controlled heating and cooling with automated dampers, and complete lighting control via automated shades.

Also, all the phones would route through the ethernet using VOIP.

That is my pie-in-the-sky part of this scheme…but that would be fun to do. It would also take a really long time if I was working at the time.

Also of note, I’m seriously researching some new stuff about moving to solar power for the house. That would greatly aid my doing this, because I’m sure some of the power stuff would get re-worked at this time anyway.

There you have it folks, my Audio / Video / Home Automation / communications nirvana project. I wonder if I did this, if it would actually be valuable to anyone who owned the house after me?

Ben = Intel Fanboy?


Yes, that is right, ladies and gentlemen: after many years of Intel bashing, and hating them because they are the Microsoft of the hardware world, I am going to jump camps unless AMD pulls something amazing out of their ass.

I need to lay out a little timeline just to help explain how much I hated Intel.

Somewhere in '99 to 2000-ish, Intel and AMD were in the P3 vs. Athlon race to 1GHz. I was a freshman in college taking ECE 110 or some such thing. Both camps used the same memory, and the only architectural difference between the processors that made any mark in anything was the presence of SSE in the Intel stuff.

Then Intel introduces that stupid RDRAM crap…I think they even made a P3 chipset for it. It was expensive, proprietary, and those Rambus bastards always struck me as either smug or whiny. In the meantime AMD is mopping the floor with Intel in the benchmarks that don't use SSE, and VIA is the king of AMD chipsets.
2000–2001: I help Dirk build his (at the time) kick-ass computer: an AMD Athlon 850MHz (best bang for the buck at the time) with the VIA KT133A chipset…king of the PC133 chipsets. I take ECE 290 and learn about pipelining, then go home for a break…I think it was Thanksgiving. Intel releases the engineering monstrosity driven purely by crap-tastical marketing-department thinking known as the Pentium 4. The dark black spot in my heart gets seared with the name “Intel” (which has now been replaced with “3 Doors Down”, but that is for another blog). The Pentium 4 is released at speeds of 1.6, 1.7, and 1.8GHz (the 1.4 was the fastest P3 or original Athlon ever made), but the performance of each of these chips is less than the faster Pentium 3s due to its incredibly long integer pipeline. Marketing people scramble in both camps, and AMD decides that their next revision of the Athlon will not carry its clock speed as its name anymore; for distinguishing purposes it gets a marketing number meant to compare it to the performance of the P4. Hence the name of the processor in my desktop is “Athlon XP 1800+”, but it is actually a 1.53 GHz processor.

Spring: I take ECE 291 and learn more about instruction sets and how to program for the deep dark parts of the computer…this is the class that means you get to graduate from the Hogwarts School of Dark Computer Magic. I hate Intel less, because MMX is cool, but hate them more because 3DNow! was a great idea, and it died because Intel came out with SSE. The Pentium 4 started to push 2.0 GHz sometime around here.

2001-ish to 2002: no more CompE classes left to take for the lowly EE, so I just go on hating Intel because it seems like the right thing to do. AMD comes out with DDR memory based chipsets; Intel notices but tries to ignore it and says RDRAM is the way to go….they are stupid, and lawsuits with Rambus complicate Intel's position.

DDR based chipsets start wiping the floor with Intel again…they look really stupid for sticking with RDRAM and making a 20-some stage pipeline.

I go on a co-op job, I get cash, and I build a sweet computer based on DDR RAM with an Athlon XP (now with SSE support freshly licensed from Intel). My computer rocks the socks off most things Intel at that point.

2002–2003: AMD gets lazy and starts to lose performance ground to Intel, mostly thanks to the massive amounts of money Intel can sink into refining their silicon wafer process. It is realized that the Athlon XP needs a revamp…and the “Hammer”, AMD's 64-bit parts, are delayed. Intel comes out with Hyper-Threading…which was somewhat visionary, although ultimately not very useful in most common applications. Hyper-Threading allows two “threads”, or two programs, to run through the processor at the same time.

2004: AMD unveils the Athlon 64 in the fall, which starts to handily WHOMP Intel again with its onboard memory controller. Brilliant move by AMD, but now the waiting game begins because Microsoft is dragging their feet in getting an OS out that takes advantage of the 64-bit-ness. Intel promptly recants their earlier statements that 64-bit is silly for the consumer right now, and starts to announce their upcoming 64-bit processor. I think, once again, that they are thinking with their marketing team, not their engineers…which I say is the equivalent of a man thinking with his little head and not his big one: it always seems like a good idea at the time, in immediate payoffs, but down the road it just burns your dick when you get screwed by making stupid mistakes.

Turns out Intel does something smart this year and comes out with the Pentium M (the processor for Intel's Centrino technology). I begin to wonder if all the smart people working for Intel are in Israel (where the Pentium M was developed), because it is a kick-ass processor. Expensive, but still, it rocks. Good battery performance, good speed, and it is based on the old P3 architecture.

2005: Intel continues to be stupid by releasing the “Prescott” core for the Pentium 4. This ups the pipeline to a whopping 31 stages and allows for speeds up to 3.6 GHz. Intel has now completely lost its performance crown; no one who has read the articles even thinks about buying these chips. Even the Intel-boner-sucking Tom's Hardware had to admit that AMD is badass. You can fry eggs on Intel heatsinks now, because they are ridiculously hotter than AMD's. AMD no longer carries much stigma for processors overheating, thanks to their “Cool 'n' Quiet” technology and in general just better engineering than Intel, and better than they have had in the past.

The Pentium M is still sweet. AMD tries to come out with the AMD Turion processor, which no one notices because: “AMD makes chips for laptops?!?” AMD needs to stop being smart and start using better advertising, because their chips are really good, but ultimately not much better than that Pentium M, so they aren't going to sell themselves. Intel also makes the deal sweet for people like me at work by giving system builders kickbacks for building systems with Intel chipsets, processors, and Wi-Fi cards as a package. Standard marketing thing…Microsoft does it too, and so does AMD, except they are cheap about it. (All I got from them was a window sticker; at least Intel gave me a hat.)

2006: Intel unveils the Core Duo, which is a great ramp-up from the Pentium M. I would tell you to buy this processor if you are getting a laptop this year…but it turns out they had another announcement to make this week.

This week, they debuted processors to replace the Pentium 4, Xeon (server chips), and an upgrade to the Core Duo, which are totally freaking AWESOME!!!!

They completely overhauled the entire processor line, and provided a 20% performance increase vs. the fastest, most badass, dual-core Athlon 64 FX-60. Basically they win every single performance comparison by an average of 20%, and these chips will be available for purchase in the second half of the year.

They are well designed, and they are supposedly way cooler. The mobile parts get the same battery life as the current Core Duo chips, but provide a 20% performance increase and are pin-compatible with the Core Duo, which means that when these chips become available it is an easy switch for manufacturers. These are really made with the consumer in mind, not the marketing executive in mind like the P4. AMD will come out with something to match it, or compete, but it looks like a tough challenge because they are having a hard time getting their memory controller to perform well with the switch to DDR2 memory (something I repeatedly questioned the AMD people about when they made campus visits).

So, now I like Intel again, and I basically realized that I can like either company as long as they aren't trying to screw the consumer like Intel did with the P4 and their strong-arm monopolistic business practices in the '90s.

It is going to be a fun year for the processor wars, just like 1999 and 2000.