Category Archives: All things Unix

Upgrade Challenges: Avoiding the “Microsoft Tax” and Buying American

The signs that the Great Recession is receding can be found in the return of a tradition taken from the pages of Christian scripture: the Christmas Shopping Frenzy.  We’ve never understood how the one-paragraph note in the Gospel of Matthew–describing the arrival of the Magi bringing gifts to the infant Jesus–has created, two thousand years down the vortex of time, a world-wide phenomenon, a seasonal gifting orgy of conspicuous consumption that transcends and even obscures the religious symbology of the gifts.  Not to mention that the original story ended badly: the local political power (Herod) subsequently engaged in a horrific massacre of male infants in an attempt to eliminate a perceived threat of competition from the giftee, while the gift-givers fled to avoid interrogation and the target of the pogrom was whisked away to a foreign country.

Here at Chaos Central, we observe the Christmas tradition in a more subdued manner: a small celebration with close family who are practicing Christians, with small gifts, some hand-made, and exchanging end-of-the-year greetings with friends.  And, since, by coincidence, the holiday does happen at the end of the calendar (and tax) year, we do evaluate our balance sheet and make a few last-minute gifts to charities, as well as gifting ourselves with any tax deductible business purchases that were on the near-term planning cycle.  A truly secular side to the “shopping season.”

In this topsy-turvy economy, where millions are unemployed but there is no shortage of merchandise made “off-shore,” and the number two and three brands sell well only because the number one brand sold out during the enforced shopping frenzy, it makes sense to consider making small changes to how we buy things, to reverse the economic trends that have brought us to the brink.  Competition is healthy: the existence of near-monopolies stifles innovation, even though innovation may have created the monopoly in the first place.  That’s one reason we put off buying computers: there are simply too few alternative choices.  It is much too late to “Buy American,” because very little manufacturing is done in this country anymore.  Almost all computers, whether running Microsoft Windows or Apple OS X, are manufactured offshore.  And those are the only choices of operating systems in the big-box stores.  But we can at least buy things assembled in America, and with a choice of open operating systems, if we look for them.

In our case, it is time to upgrade our aging stable of computers, replacing an 8-year-old workstation now suitable only as a graphics terminal for virtual machines (and not very good at that), and a 4-year-old laptop with limited memory and disk size.  We need a laptop suitable for hosting multiple virtual machines and a full multi-media desktop to adequately support our business projects.  These are overdue, postponed while we weathered the deepest and most personal effects of the Recession.  Things fall apart.  Not literally, but the thread of progress gets unraveled when there is no growth due to lack of capital.  [Lest we get confused, capital is not profit–what is wrong with much of America is the failure to invest in capital, in an attempt to keep up the appearance of profit during downturns.]  Computers can last ten years or more if properly maintained (we have one, a Sun workstation, still running after ten years, and 15- and 20-year-old “retired” computers that will still boot up), but they are not cost-effective after four or five years: they simply cannot do the work to remain competitive against newer systems, and often cannot run newer software efficiently, if at all.

For some years now, adding a new computer running Unix or Linux to the network at Chaos Central has usually involved buying an off-the-shelf machine and stripping off the unwanted default operating software–i.e., the currently shipping version of Microsoft Windows–or ordering piece parts and building a bare machine from scratch, which is possible for desktop machines but difficult in the case of laptops.  In either strategy, the machine is not ready for use until a suitable replacement environment has been installed.  Server-class machines can be (and have been) ordered with no installed operating environment, but the choice of portable systems and compatible desktop workstations has been limited to systems manufactured by Apple, running the OS X operating system, or a wide variety of other Intel- and AMD-based machines–all running Microsoft Windows.  While OS X is a variant of Unix (its Darwin kernel layers BSD atop the Mach microkernel), GNU/Linux, Oracle Solaris, and FreeBSD are more commonly compatible with the server systems that we administer and program for clients, so those are what we want on our desk and in our luggage.

Fortunately, 20 years into the GNU/Linux revolution, the market is large enough that a number of enterprises have sprung up to build and sell systems that run Linux “out of the box.”  A few major manufacturers, like Dell, did offer Linux choices at one time, but for various reasons–too small a segment of the commodity desktop/laptop business at the time to justify diversifying software choices, and/or problems with Microsoft OEM licensing agreements that applied to product lines rather than individual machines–they dropped the offerings, except on the much smaller and more customized server product lines, where they sell only licenses and media: installation and configuration is left up to the buyer.

[Photo: Small-footprint desktop, preloaded with Ubuntu]

However, near west coast port cities like Seattle, San Francisco, and Long Beach, the ready availability of computer piece parts in economical small lots from tier 1 importers makes it possible for small businesses to build custom non-Microsoft computer systems at nearly-competitive prices.  As it turns out, the market for Linux workstations overlaps with the market for high-end video game machines–with powerful graphics, multi-core processors, and lots of memory–so there is a plentiful supply of components, most of which aren’t found in commodity desktop machines anyway, and the price difference is well within reason.

We like a bargain as well as anyone else, but, as a small home-based business ourselves, we prefer paying a little more, knowing that the extra is providing a living wage to fellow entrepreneurs and folks who love what they do, not boosting the portfolio of an executive as a bonus for outsourcing the entire product and support pipeline to southeast Asia.  We bought our new machines, a high-end laptop and a workstation powerful enough to serve multimedia applications, from ZaReason, a small company whose owners we met at LinuxFest Northwest last spring, where we got to check out their offerings.

Buying locally-assembled products isn’t bringing back “Made in America” factories, but it’s a start toward turning a nation of consumers into a nation of producers who take pride in what they make with their own hands and minds.  We’ve written here a lot, recently, about our adventures on our “Made in Oregon” tandem bicycle, and we’re now configuring the next generation of Linux computers, “Made in California.”

A TAXing Chore

2010 marked the first W2-free year for the residents of Chaos Central in more than 45 years.  No, we didn’t get laid off from $JOB like so many Americans: after dabbling for years in part-time consulting, we’ve taken the plunge and are now completely self-employed.  While we still have to wait until the end of January 2011 for the 1099 forms to trickle in, we do need to start estimating our taxes early, to make sure our quarterly payments track with the IRS’s estimate of what we should owe.  So, off to the wholesale club to pick up this year’s copy of Intuit’s TurboTax, which we’ve been using for the past ten years or so.

Chaos Central is a Unix shop (this is the Unix Curmudgeon’s blog, after all), and the development projects haven’t justified adding an Apple with OS X to the stable of machines, so TurboTax and other Intuit products do present some problems for us.  For several years, we’ve maintained a Microsoft Windows system for the sole purpose of running TurboTax.  Since the demise of our aging Windows installation (see our November post, “5640 Reasons to Not Use Windows,” for the whole story), we’ve relied on Oracle’s VirtualBox virtualization application to run Quicken 2010 under a Windows XP license that came with a machine that has long since died.  We also have a clone of the XP system that succumbed to a ClamWin AV bug, running under Citrix XenServer.

The system running under XenServer–thank goodness we ran the clone process as part of our evaluation of XenServer–was the one under which we’ve been running TurboTax, so it was a logical choice for this year’s version.  Alas, the remote desktop capabilities of XenServer just weren’t up to the video calls that TurboTax makes (what the Unix Curmudgeon refers to as “stupid Microsoft tricks”), so we copied last year’s TurboTax files from that system to the system running under VirtualBox, where TurboTax installed just fine and loaded our profiles from last year.  In fact, it installed remotely, as the VirtualBox server is in the Realizations Fabric Arts studio, and the XenServer system is in the Information Engineering Services office, one floor up at Chaos Central.

Being too lazy to trot the TurboTax CD downstairs, the Unix Curmudgeon simply looked up the block device the CD was mounted from (sr0), then ran ‘dd if=/dev/sr0’ piped to an SSH session that launched ‘dd of=Turbotax.iso’ on the VirtualBox server.  Both XenServer and VirtualBox allow you to use ISO images as virtual CD/DVD drives, so there is really no reason to burn CDs or DVDs from downloads to install virtual machines.
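
In concrete terms, the transfer was a one-liner, something like this (the hostname, VM name, and controller name here are illustrative, not the actual ones at Chaos Central, and the attach step assumes a VirtualBox recent enough to have the storageattach subcommand):

    # Stream the raw CD device over SSH and write it out as an ISO
    # image on the VirtualBox server downstairs.
    dd if=/dev/sr0 | ssh vbox-server 'dd of=Turbotax.iso'

    # Then, on the VirtualBox server, attach the ISO to the XP guest
    # as a virtual DVD drive (names depend on how the VM was created).
    VBoxManage storageattach "WinXP" --storagectl "IDE Controller" \
        --port 1 --device 0 --type dvddrive --medium Turbotax.iso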

This is still a bit of a pain, as running Windows programs in this way requires you to actually run Windows itself.  Worse, Windows is an unruly virtual machine guest, as it tends to gobble up as much CPU and memory as you assign to it.  Windows is also an unruly remote desktop server, as it doesn’t respond well to remote mouse movements, resulting in much frustration, though the presentation on the VirtualBox server console itself is adequate.  The virtualized XP installation on the XenServer is there now for the sole purpose of providing the XenCenter console.  Citrix apparently intended XenServer to be used for virtualizing Windows Server instances, so XenCenter, naturally, only runs on Windows.

Our preference, if we must run applications that are only available for Microsoft Windows, would be to run them under Wine (a recursive acronym for “Wine Is Not an Emulator”), an Open Source compatibility layer that runs under Linux and translates Windows system calls into Linux system calls.  Unfortunately, many applications rely on Microsoft intrinsic shared libraries (DLLs) or use so-called “undocumented” tricks to perform well under Windows, so they can’t readily be run under Wine.  Wine itself is a feat of reverse engineering, to which dozens of Linux programmers who can’t give up their favorite Windows-based games or killer applications (like Adobe Photoshop) have devoted much trial and error.  TurboTax is one of those applications that, since it incorporates some direct disk accesses as part of its protection mechanism (an example of an Undocumented Stupid Microsoft Trick), just can’t be run under Wine easily.  But, since virtualization emulates the Windows hard drive, these tricks can be safely executed in the Unix environment without harm to the host, and without the client application realizing that the disk sector it was writing to was simply a block of bytes in a file on a larger system.  The other trick is to successfully map Windows video calls onto a virtual video card and then translate them into an image that can be displayed on an X Window System terminal.  Some applications that make extensive use of the ActiveX protocol, of which Microsoft is so proud, render “active” regions in such environments as blacked-out areas on the display.
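
For applications that do cooperate, the Wine workflow is pleasantly mundane; a minimal sketch (the installer and program names are hypothetical placeholders):

    # Run a Windows installer under Wine; it populates a private
    # C: drive under ~/.wine rather than touching the host system.
    wine setup.exe

    # Launch the installed application from its simulated C: drive.
    wine "C:\Program Files\SomeApp\someapp.exe"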

So, you say, “Why don’t you Unix Curmudgeons just use Windows, as $DEITY intended?”  Because, dear reader, whereas the Microsoft vision is “One user, one computer,” for us, in the words of the late, great Sun Microsystems (now part of Oracle), “The Network Is the Computer.”  The Windows user experience is limited by system settings for “desktop” or “server” priorities, while Unix systems can be fine-tuned to meet almost any unique computational environment, and the fundamental philosophy of Unix promotes equitable sharing of resources among hundreds of processes, which may be owned by many different users, with complete security.  On my screen at the moment are complete graphical desktop presentations from a half-dozen different computers running different versions of Linux, and individual applications running on those and other computers, from which I can cut and paste text as if they were all running locally.  This is a capability most Windows users cannot even conceive of, even though the web browser, that universal window into the broader network, offers some of it.  We grudgingly use Windows only for applications for which workable equivalents are not available in Unix, and will stop using it when they do become available for our preferred systems.
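
For readers who haven’t seen this in action, the simplest form is X11 forwarding over SSH (the hostname and application here are illustrative):

    # Run a graphical application on a remote Linux box, with its
    # window displayed on the local desktop as if it ran locally.
    ssh -X devbox gnumeric

    # Or pull up an entire remote desktop, e.g., with VNC:
    vncviewer devbox:1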

Practice Makes Perfect

The “10,000-hour rule” originated with psychologist Anders Ericsson’s research on violinists, but was popularized, and broadened, by author Malcolm Gladwell.  The rule states, essentially, that mastery of a given field requires about 10,000 hours of dedicated practice.  For musicians, this is certainly true, given the time it takes to understand the nuances of one’s finely tuned instrument, commit difficult passages to muscle memory, and correctly interpret the composer’s auditory vision of the score.

But, what of other professions and activities?  What constitutes dedicated practice in other fields?  The 10,000-hour goal in the world of work is about five years of 8-hour workdays (250 working days a year × 8 hours × 5 years = 10,000 hours).  In most professions, five years is the entry-level experience for a “senior-level” position, what I would call “journeyman-level” in a skilled trade: someone who can handle routine tasks without supervision or guidance.  The next level of performance is measured at the eight- or ten-year experience level, given that probably a third to half of the average work-day is given over to socialization and the business of doing business, during which we are not practicing the skills of our trade.

For the profession of system administration and similar technology pursuits, the 10,000-hour rule is even harder to judge, since mastery of the art is a moving target.  I personally have almost 30,000 hours of paid work as a systems administrator, of which we can count about half, or 15,000 hours, as “practice,” plus a few thousand hours of actual practice and study specific to the skill, prior to and outside of on-the-job time.  The muscle-memory part of the skill set is in keyboarding, which carries through, and the Unix philosophy set down by Thompson and Ritchie in the early 1970s still rewards thorough understanding and practice.  The analogy to music further maps the idea of mastering a composer’s body of work to mastering the various “flavors” of Unix: Solaris, BSD, SVR4, AIX, HP-UX, and, of course, the varied Linux distros.

This analogy breaks down when we consider that the body of work refuses to stay static: Unix is a constant sequence of variations in the form of upgrades, after which the original is never performed again.  Software and hardware are intertwined, where older editions of systems software can only be performed on obsolete and non-repairable hardware (though emulation can eliminate that obstacle), as if a particular work of music could only be performed on a single, fragile instrument.  So, mastery of systems administration is a dynamic process: virtuosity requires constantly learning new works (software) played on new instruments (hardware systems).  I don’t think 15,000 hours constitutes mastery of the art, divided as it is between several major versions, and with time split between straight system administration and programming, in an ever-changing environment.

Yet, mastery requires practice, meaning exercises in the fundamentals: in music, scales and fingering exercises; in systems administration, scripting, building kernels and applications from source, and testing.  These activities are the muscle memory of system administration: the automatic responses that free the mind for the difficult problems requiring planning and reading the manual, and the ability to apply a resolution promptly and with skill.
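
As a concrete example of that sort of finger exercise, the classic build-from-source drill looks something like this (the package name is purely illustrative):

    # Unpack, configure, build, test, and install a package from
    # source: the sysadmin's equivalent of practicing scales.
    tar xzf some-utility-1.0.tar.gz
    cd some-utility-1.0
    ./configure --prefix=/usr/local
    make
    make check              # run the self-tests before trusting it
    sudo make install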

Which brings us to the other side of expertise: mastering a given skill requires physical stamina and flexibility, as well as a clear mind.  For the past 33 years or so, my outlet to balance the rather sedentary computing life has been bicycling.  Now, masters of the art like professional bike racers and world-class bike tourists accumulate 10,000 hours of practice within five to ten years.  As a bike commuter and accidental bike tourist, my mere 50,000 miles–the estimated lifetime accumulation at present–at perhaps an average speed of 14 miles per hour, constitutes only about 3,500 hours of practice, far short of mastery of the skill.  Worse yet, the past year has seen the end of bike commuting, since I now run my professional practice from home.  I should be able to take my former commute time and recreational biking time and use it for practice, but the incentive just isn’t there.  Until one realizes that, like system administration or music, regular practice is essential not only to achieving mastery, but to maintaining a reasonable level of competence.

A bit over six years ago, I was inspired by a true master of the sport, Shirley Braxton, who, with her late husband Sam, had owned a bike shop in Missoula, Montana, and, in the 1970s, did much to start the bicycle touring industry by building bikes designed for loaded touring.  Shirley, then in her late 70s, made a practice of capping the bicycling season with a “birthday ride,” of length in miles equal to her age in years.  For a few years, this was motivation enough to get out and train through the summer, building up to an early-fall 60-odd-mile ride, even to the point of participating, as a training ride, in three MS Tours–a fund-raising ride to benefit multiple sclerosis research, originally a 150-mile ride in two 75-mile stages, which, because of popular support from relatively new cyclists, has for a number of years also offered 50-mile stages, the course we prudently chose.  As with any pursuit of mastery of a skill, the tools and instruments are also important to achieving the goal.  In our case [meaning, of course, the Nice Person and the Unix Curmudgeon], we ride a mountain-style tandem, which, like its single-seat relative, has a heavy frame and oversized tires, intended for trail riding rather than racing or touring, though we have toured with the tandem (see the other blog articles in our bicycling category).  In any case, swift passage and high mileage are not the strong suits of such machines, though they generally come with very low gearing that compensates on hills.  Or so we tell ourselves.

So, with advancing age, both myself and my ride–a 1996 Specialized Hard Rock, an entry-level commuter/mountain bike with fat tires, a steel frame, and no suspension–have tackled the “birthday ride” solo for the past three years, the Nice Person having come to her senses about the effects of inadequate training, leaving the Unix Curmudgeon to huff and puff his way alone through what became the longest rides of each season, by a factor of two.  Without the benefit of daily sprints across town to work, and with the last ride a month before, setting out into the foothills of the Olympic Peninsula on a nearly 70-mile course was less a demonstration of ability and skill than a simple grueling test of endurance.  But some of the lessons of nearly 35 years of all-weather commuting and touring held true: keep hydrated, keep fueled, put on rain gear when it starts raining, stow it when it stops to avoid overheating, don’t push too hard early on, and get off the bike for at least a few minutes every hour to eat, drink, and rest the legs a bit.  Oh, yes, the thought of aborting the mission did occur, once or twice, but we can report mission accomplished: 69 miles in 7 hours 52 minutes, an average speed of 8.75 miles per hour, including pit stops.  That’s just about the daily limit for a mountain bike, and one of the reasons we declined the 75-mile route on the MS Tours–there was an 8-hour time limit on that course, and it is important to finish.  Reasonable goal-setting is also a part of developing mastery of a skill.

Next year, the goal is to ride often and ride long, work on a more balanced fitness regimen, lose a bit of weight, and get a lighter, faster bike.  The rest of this year, the goal is to work more with server virtualization, hierarchical storage management, and keep up with the latest systems releases, on the work side.  Practice, practice, practice.  No matter how old you get, there is still more to learn, and roads to explore, perhaps at a more leisurely pace.

Broadcom Wireless on Linux, redux

A couple of months ago, we posted an ongoing saga of getting the Broadcom wireless to work–again–after updating to Ubuntu 9.10 on our Compaq C714NR laptop. It was one of those trial-and-error issues we’ve gone through since we got the machine back in ’07 and first loaded Ubuntu 7.10. But, with 9.10, not only did we get the wireless to work again, but the whole process of connecting with hot spots was simplified. What used to be a grueling test of endurance and exercise in command-line prestidigitation was suddenly as simple as using an Apple, with the new Network Manager applet installed.

Well, all good things must come to an end.  Last night, I started Update Manager, which included a kernel upgrade.  This morning, the reboot came up with the laptop’s little antenna indicator red, and the Network Manager icon showing no signal–wired connection only.  Yow!

So, dust off the memory cells and google up the Ubuntu forums.  Um, need to reload the bcmwl-kernel-source package.  Nope, that causes the machine to freeze on boot.  Boot to rescue mode, remove the package (dpkg -r, at a root prompt), then reboot and regroup.  Ah, last time, we got the source package directly from Broadcom, compiled it, and installed it.  As usual with any open source product, we ignore the package we already have on the machine and download a fresh one from Broadcom.  Sure enough, it had been updated shortly after we downloaded it last time.  Following the README.txt file, we are soon rewarded with the spinning icon, a “connected” splash, and the welcome antenna-with-four-bars.  Back online, then a few tweaks to make sure it boots with wireless enabled (copying the driver to the current kernel’s driver directory).
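
For the record, the rebuild went roughly like this; treat it as a sketch, since the tarball name, module path, and exact steps vary with the driver release and kernel version (consult the README that ships with the source):

    # Unpack the freshly-downloaded Broadcom hybrid driver source
    # in its own directory and build against the running kernel.
    mkdir bcm-wl && cd bcm-wl
    tar xzf ../hybrid-portsrc-*.tar.gz
    make

    # Copy the resulting wl.ko module into the current kernel's
    # driver tree, rebuild the module dependency map, and load it.
    sudo cp wl.ko /lib/modules/$(uname -r)/kernel/drivers/net/wireless/
    sudo depmod -a
    sudo modprobe wl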

Lessons learned, or, in my case, relearned, since I’ve known this practically forever (Linux user since 1996, Unix user since 1989):

  1. If you have compiled drivers not included in the distribution, you must recompile and reinstall them each and every time you update the kernel.
  2. When you install any open source package, always check for updates, especially if you have updated your system since you last downloaded it.
  3. Do the above before you reboot your machine after a kernel update, else you may be scrambling for a rescue disk when the machine doesn’t come back up.

Meanwhile, we’re waiting a while before grabbing the new 10.04 Ubuntu upgrade–patch downloads were very slow on Release Day.  I hear it is worth the wait.  Linux is almost ready for your grandmother’s desktop.  Grandma already has hers at Chaos Central, and has since Red Hat Linux 7 (she’s running Ubuntu 9.04 x86_64 now), but only because she has a full-time system administrator (Grandpa, aka the Unix Curmudgeon).

Ciao, and happy computing with Linux…

Unix at the Core — Still Viable

Yesterday, the Nice Person and the Unix Curmudgeon were prowling our favorite thrift store (the one we donate our unwanted stuff to), and we came across a shelf of old Unix books, mostly O’Reilly titles.  Some were a bit dated, like “High Performance Computing,” which, today, means huge Linux clusters on multi-core AMD and Intel CPUs, but in 1993 meant RISC processors like Sun’s SuperSPARC and HP’s PA-RISC.  But the book is built on core principles, so many of its chapters remain valid advice and technique for tweaking programs on any computer, and it goes on the shelf as a reference and memory jogger.

The real prize in this blast from the past is a Hayden Books tome dated 1987, UNIX Text Processing, with Tim O’Reilly’s name on the by-line as co-author with Dale Dougherty, and an acknowledgement to “the staff of O’Reilly & Associates.”  Much of this thick book is a crusty introduction to Unix and basic utilities, as the target audience is wordsmiths and other folks who may not have heard of Unix, rather than programmers and administrators.  But at the chewy core is a review of the typesetting runoff system (troff), with all the popular macro packages, plus the pre-processors tbl, pic, and eqn.  As a bonus, the authors show us how to use venerable command-line tools like awk to slice, dice, and embroider the raw markup.

“Big Deal,” you say.  OpenOffice Writer does all that stuff automatically.  Markup languages are out, WYSIWYG is in–in fact, so ubiquitous that nobody calls it that anymore.  Well, markup still exists: many of us, myself included, often use OpenOffice and content management systems (like WordPress, with which this blog is written) to create and edit documents, yet I find myself flipping over to the “source” view and tweaking the HTML manually in nearly every post.

But, wait–there was that word: manually.  What if we have a document format in which we want to generate documents automatically?  Oh, right–we can generate XML and write a parser for it, or write PHP code that converts database tables into HTML tables, and we can even use PDF libraries to write output directly to PDF.  Still, for a lot of documents, writing complex scripts is overkill, and needs a lot of debugging.

For a lot of ad hoc documents–prototypes or one-time reports–it may be much simpler to write a template for a troff macro, populate the template with our data using a simple filter script, and pipe it all to the Unix text formatting utilities.  With pic, you can even generate graphs from your data automatically, and embed them in the output.  The advantage of using the text formatting utilities is that you are in the realm of the core Unix philosophy–piping data from one utility that does one thing well to another–so you can easily add filters on the front end or back end to adapt to different data sources or different output formats, or quickly change the layout.
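
As a minimal sketch of the idea (the file names and the one-line awk filter are illustrative, and assume a modern groff with the ms macros), here is a comma-separated data file wrapped in tbl markup and run through the formatting pipeline:

    # Wrap CSV data in tbl table markup, then format it with the
    # ms macros; -t runs the tbl preprocessor (-p would add pic).
    awk 'BEGIN { print ".TS"; print "box tab(,);"; print "l l n." }
               { print }
         END   { print ".TE" }' sales.csv |
    groff -ms -t > report.ps

Changing the data source means swapping the front-end filter; changing the output means swapping the groff output device; the template in the middle stays put.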