Category Archives: All things Unix

Back to Business (as Usual — or Not)

Since starting this blog almost two months ago, the posts have been all over the map, literally.  The underlying thread is our continuing relationship with computers and Unix in particular, but, since we work primarily from “home,”  and often from “the road,” who we are and what we do is all-inclusive.  Computers and the craft of keeping software, hardware, and networks running are infused with how we approach everything, from home repair to textile and fiber arts, metalwork, health and fitness, and travel.

Take our recent trip to Canada.  This was, in the traditional sense, a “vacation,”  but the first thing we did on arrival was to hook up our computer to the Internet.  Sure, we used it to find restaurants and map out our daily excursions, but we also used it to keep in touch with clients and work on web sites, as well as update friends and family on what we were up to.  We had an agenda for our trip, but also some contingency plans to take into account the variable spring weather in the Pacific Northwest.  So, our “vacation” wasn’t the travel agencies’ vision of being pampered and indulged, but a reflection of our ability to make plans and execute those plans.  And, have a bit of fun doing it.

I remember the ads from the early 1990s, when the Internet was just getting notice among the general public and non-computer businesses.  The tagline went, “FAX from the beach?  You will.”  At the time, it looked like it might be another one of those “flying car” or “smart house” ads from the 1950s that never quite happened.  But, the computer is one of those machines that isn’t bound by moving tons of earth and concrete, legislation, and public funding, so anything one can imagine doing with it can actually be tried.  And easily copied.  In the 1990s, I did actually “FAX from the beach,”  but it involved dragging a thermal-paper FAX machine to the resort, a computer with dial-up modem, and patching through to my servers at home through the resort switchboard.  Today, FAX is one of those old-fashioned technologies, like telegrams, and even the most rustic of B&Bs have wireless networking.

So, you can take your business with you, and most of us have, connected to the office or our clients through smart phones or wi-fi, 24×7.  Inevitably, our daily life becomes part of our business, and vice versa.  We’re not going to hide the fact that we’re out on the trail, or remodeling the laundry room, when you call for a project update.  When video phones were first proposed, people wrote about needing fake backdrops so your caller would think you were in your office cubicle instead of on your patio or at a beach resort.  The reality is, people have found that business and pleasure do mix: it’s healthy to be active and social, and, thanks to modern computer and communications technology, you can stay in touch with business projects and even bring a fresh approach to solving problems, because you are more refreshed, more relaxed, and more open to new ideas in a dynamic setting.

We look at these travelogues and non-computer project articles as a reflection of our approach to getting things done, one that directly relates to how we approach solving business problems, without going into case studies.  OK, maybe we’re not “on the job” eight hours a day–sometimes it is more than that.  I can’t count the number of times over the past 30 years of bike commuting that I’ve come in from a long ride with a fresh insight on solving a problem or tackling a project design.  The kind of “head work” that is necessary in a problem-solving and creative profession doesn’t turn on and off when you sit down at the keyboard; it draws on the sum total of life experience.

Coda on “Microsoft Turns 35”

Preston Gralla, at ITworld.com, wrote a blog post, “Microsoft Turns 35.”  On the subject, we’d quote the Vice President [speaking “off the record” on the health care bill] here, but this is a family blog, so we won’t. Anyway, Preston’s article about the successes and failures of what the Unix Curmudgeon fondly calls “The Redmond Menace” skimmed over a few milestones and features, being mostly non-technical in nature.

Of course, it is fairly well known that Unix predates Microsoft by five years: the Unix Epoch, which is also the time system used on the Internet, starts on January 1, 1970, GMT, so zero-time in the United States always falls in the afternoon or evening of December 31, 1969.  That explains those SPAMs you get dated 1969: the forged emails have no timestamp, so your mail system puts in “zero” for the “sent time.”  What isn’t so well known, and wasn’t pointed out in the article, is that, for a time, Microsoft licensed Unix from AT&T and sold a version called Xenix, which it ported to the Intel 80286 microprocessor, a chip that could access more than 1MB of memory and had a protected mode to provide kernel security. Microsoft didn’t directly market Xenix, but sublicensed it to the Santa Cruz Operation. What Microsoft did do was to “borrow” some of the concepts [but not the code–that would be “software piracy,” something that Microsoft has vowed to eliminate by making sure every computer on the planet has a paid-up Windows license] from Unix to incorporate in MS-DOS 2.0.
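That zero-time arithmetic is easy to verify; here’s a quick check in Python (standard library only; the zoneinfo module needs Python 3.9 or later):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Timestamp zero: midnight, January 1, 1970, UTC.
print(datetime.fromtimestamp(0, tz=timezone.utc))
# 1970-01-01 00:00:00+00:00

# The same instant in a U.S. time zone lands on the evening
# of December 31, 1969; hence those "1969" spam dates.
print(datetime.fromtimestamp(0, tz=ZoneInfo("America/New_York")))
# 1969-12-31 19:00:00-05:00
```

The further west you go, the earlier on New Year’s Eve timestamp zero lands: 4:00 PM on the Pacific coast.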

MS-DOS 1.0, as mentioned in the referenced article, was essentially a 16-bit reverse-engineered clone of CP/M and wasn’t very extensible, especially in handling file systems on large disks: CP/M was originally an 8-bit floppy-disk program loader, and IBM wanted to add a 5-MB Winchester hard disk to the PC. Subdirectories came from Unix, but Microsoft retained the drive-letter designation from CP/M and reversed the directory separator, using a backslash, because CP/M (and DOS 1.0) used the forward slash for command-line options and, then as now, backward compatibility was a primary goal, even if it perpetuated bad ideas and awkward constructions. Since the Internet was developed out of Unix networking, the Web uses the Unix conventions, which causes confusion between Windows native file paths and web paths to this day.
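That split survives in every scripting language today; Python’s standard library, for one, ships both conventions side by side, which makes the contrast easy to demonstrate (the paths here are made up for illustration):

```python
import ntpath      # Windows-style path handling: drive letters, backslashes
import posixpath   # Unix-style path handling: forward slashes

# The same logical path, joined under each convention:
print(ntpath.join("C:\\", "Users", "web", "index.html"))
# C:\Users\web\index.html
print(posixpath.join("/", "var", "www", "index.html"))
# /var/www/index.html

# URLs follow the Unix convention, so Windows paths must be
# translated before they can appear in a web link.
print(ntpath.sep, posixpath.sep)  # \ /
```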

Of course, CP/M came along about 15 years into my IT career, and was the system on the first microcomputer I used at work in 1981. In the prior age, when we wrote computer programs, each program was self-contained, and the computer only knew how to read the first file on the tape.  The idea behind CP/M, a universal front-end for microprocessors that would load other programs and handle input and output, had a lot of promise, and it made it easy for companies to write programs that would work on any computer with a compatible processor and CP/M. I even attended the big CP/M ’83 conference in San Francisco, where the latest database, spreadsheet, and word processing programs were rolled out, still doing rather remarkable things on 8-bit microprocessors. But, by the end of the year, MS-DOS and IBM had cornered the big business market with what was essentially a 16-bit reverse-engineered port of CP/M, and CP/M itself was relegated to the dustbin of history.

My first MS-DOS machine at home, in late 1985, ran MS-DOS 4.0, which was, as stated, buggy and missing a lot of the capability added by MS-DOS 5. A couple of years later, I installed Windows 2.0/286, which was terrible compared with the Digital Research GEM (Graphical Environment Manager) used as the front-end, under MS-DOS, for a lot of graphics programs in the mid-1980s, and no match whatsoever for the Apple Macintosh and its far more polished graphical interface. Both GEM and the Macintosh were inspired by the Xerox Star graphical user interface developed at the Palo Alto Research Center, and Windows was inspired by the need to kill Apple at all cost (GEM was killable by virtue of being an MS-DOS application that simply wouldn’t run with Windows). Windows was little more than a slow, low-resolution, limited-color task switcher, and few programs were written for it, because the programming API was a mess that took months for an experienced programmer to master.

But, since I had an 80286 computer, I could run a real operating system on it, instead of the Microsoft abomination. Xenix, since royalties flowed through the Santa Cruz Operation to Microsoft to AT&T, was extremely expensive, but a decent Unix clone modeled on Version 7–the last in-house edition before the breakup of the Bell telephone monopoly allowed the resulting AT&T to market Unix commercially–was available for $99. So, I ran Coherent–for years. It not only ran multiple programs concurrently, but we bought a few text terminals at a swap meet for a few dollars each and the whole family could use it at the same time–it was multi-user.

With the introduction of the 32-bit Intel 80386 microprocessor, which could run standard Unix, SCO broke out from under the thumb of Microsoft and released SCO Unix, letting Xenix die the death it deserved (the 16-bit computers had a 64K data-segment size limitation, something we also had to deal with in Coherent). At home, we finally retired the 80286 and jumped directly to an 80486 when Coherent was ported to 32-bit. By then, Microsoft had released Windows 3.1, which actually had some usable capability, though it was, like GEM, little more than a graphics manager that loaded on top of MS-DOS, and was still 16-bit. But, IBM and Microsoft had been building a real[tm] 32-bit operating system, OS/2. Although Unix (Coherent) was the workhorse system at home, I needed to develop MS-DOS and Windows applications, so I bought another computer and immediately installed OS/2 on it. OS/2 was rock-solid and performed well, running Windows programs as well as native code. But, because of the split between IBM and Microsoft, when Microsoft released its own 32-bit operating system, Windows 95, OS/2 could not run programs written for it, and OS/2 simply died.

Even mighty IBM was not immune to the wrath of the Redmond Menace–a company that, time and again, exhibited a ruthless policy to either absorb or destroy any possible competition. The easy targets were, of course, companies that depended on Windows and MS-DOS as an operating platform. Apple and Unix continued to survive and even flourish, though their market share percentages shrank as Microsoft swallowed the world with draconian licensing policies that made it impossible to build a computer capable of running Windows without paying for and installing Windows on it. All of us running Coherent, OS/2, and, later, Linux, not only paid Mark Williams Company, IBM, or one of the Linux distributors–Slackware, Debian, SuSE, Red Hat, or others–for a usable system and support, but also paid Microsoft for a useless system to throw away. This means, of course, that, though Microsoft claims more than 90% market share, based on PCs sold versus Macintoshes sold, a substantial percentage of those computers shipped with factory-installed Windows are actually now Linux computers. We try not to contribute to the Microsoft market share (or their already-bulging coffers) by building our own Linux computers from spare parts, but, of course, these don’t even show up in the numbers because they weren’t sold as complete systems.

Time will tell whether Microsoft will survive to see 50.  Their flagship system, Windows, has evolved slowly, through blunders and misguided attempts to layer new concepts on top of dead-end designs, layers that have added complexity to the programming and management of the systems, so that turning the juggernaut onto a new track may be nearly impossible.  Apple solved its transition into the 21st century, and into its second quarter-century of existence, by porting its trademark user interface onto a solid, standard, and open operating system, a version of BSD Unix named Darwin, though the result is one of the stranger flavors of Unix to administer.  I can’t see that happening with Windows–it is just too alien and too convoluted to fix by running it on top of Unix.  Rewriting the NT kernel one more time, fixing the security issues from the bottom up instead of with the top-down weekly, monthly, and emergency band-aids Windows users are subjected to, would be a massive undertaking, and we’ve seen 10 years of effort since Windows 2000 yield what seem to be minor refinements and even some major setbacks, with the Vista fiasco as a prime example.  And, with the improvements in automated installation of Linux and the renewed visibility of Apple in the music and phone marketplace, users know they have a choice, and don’t have to endure years of frustration in hopes of getting a system that works to replace the one they have.

What a Difference a Gig(abyte) Makes

Most commodity computers come with too little RAM.  Manufacturers don’t deal with your data, they deal with price points.  So, your new computer most likely came with just enough RAM to boot up and run whatever demos the vendor bundled.

At Chaos Central, the Unix Curmudgeon (and the nice person he lives with) run mostly Linux and Solaris.  But, we do have one machine running Windows XP for those got-to-have programs (like Internet Explorer, for testing CSS code and fixing IE bugs in it, and some other programs that use Microsoft’s DLLs to read/write database files).  The desktop machines and servers aren’t much of a problem, because we build them to order, but laptops and other preloaded systems have the lowest-bidder issue with RAM.

Recently, after installing Google Chrome on our Linux laptop, we started noticing a bit of thrashing going on with 30 or 40 tabs open in Chrome, along with the usual and customary background of MySQL, PostgreSQL, Apache2, and other trappings of a developer’s system.  The aging HP-Compaq came with Windows Vista, which we used just long enough to download and burn a copy of Ubuntu 7.04, and with 1GB of RAM, which did not make Vista look like something you could actually use, but was adequate for Ubuntu, pre-Chrome.  But, finally, the upgrade-or-replace issue came to the fore, and, with the recession in full recovery and housing sales not so much, we elected to upgrade, bumping up to 2GB.  Aha, that is the sweet spot.  I can now run for several days before Chrome or Firefox gobbles up enough memory to initiate that feeling that your computer is executing machine code off the disk, which it essentially is when swap space gets used up.
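On Linux, you can watch that tipping point approach by reading the kernel’s own bookkeeping in /proc/meminfo. A minimal sketch (Linux-specific; the helper name is ours, and the MemAvailable field assumes a reasonably modern kernel):

```python
def meminfo() -> dict:
    """Parse /proc/meminfo into a dict of integer kB values (Linux only)."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, _, rest = line.partition(":")
            info[key] = int(rest.split()[0])  # values are reported in kB
    return info

m = meminfo()
print(f"available RAM: {m['MemAvailable'] // 1024} MB")
print(f"swap in use:   {(m['SwapTotal'] - m['SwapFree']) // 1024} MB")
```

When that second number starts climbing while the first hovers near zero, the disk light stays on and it is time for more RAM.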

The Windows machine, we pretty much ignored, putting up with performance so slow that the disk light stayed on all the time and wire-frames of applications that promised to open “someday” hung on the screen behind the “hourglass of death.”  However, with Tax Day only a month away and TurboTax freezing whenever we went to find yet another receipt or IRS form, we shelled out $100, about 1/3 of what we paid for the refurbished, off-lease box in the first place, to bump RAM from 256MB to 2GB.  Hey, Windows actually loads fast enough to mellow the Curmudgeon from hating XP absolutely to merely loathing it.

Meanwhile, the nice person’s Linux box is running 64-bit on 3GB, and doesn’t even slow down when the Curmudgeon lights off yet another virtual machine in the background.  In fact, XP runs even faster in a virtual machine on this box than it does on the upgraded stand-alone machine.  Next year, we’re doing our taxes on a virtual machine.  Hmm, now that the designated Windows box runs better, just think of how snappy it will be when we reload it with FreeBSD or CentOS.

The Machine Stops: keeping up with maintenance

The title of this article is taken from a 1909 science fiction novella by E. M. Forster, better known for his novel “A Passage To India.” In the early sci-fi story, humanity has somehow ruined the environment and is forced to build underground, where automation has brought about a Utopian age where no one wants for anything.  Except, in this pampered world, society has forgotten how the technology works, and when it breaks down, “civilization” ends.  Hence, the subtitle: keeping up with maintenance, something we fear is becoming a lost art in the age of throw-away planned obsolescence, but something we need to do to survive the recession and age gracefully while still doing the things we love to do, which, at present, don’t involve being pampered and sedentary.

Today, we are on Vancouver Island, BC, Canada, planning a few days of bike riding on the popular rail-trails around Victoria, despite predictions of cold winds and occasional rain.  Our steed for the occasion is “Leviathan,” a 1986 Santana Arriva XC mountain-bike-style tandem, which first rolled onto these shores in September 1986, when we rode one of the American Lung Association’s first Tri-Island Trek fund-raising rides.  The old bike has been through a lot in the 24 years since then, countless sets of tires, new rims, several sets of chains, a set of chain rings, a freewheel cluster, several sets of pedals, and had fenders and bar-end grips added.  But, somehow, we expect it to just keep running.

Lately, we have been thinking about a new bike: “Leviathan” is heavy and slow, and, at nearly eight feet in length, not feasible to take on public transit, except ferries, without complete disassembly, and sometimes not even then.  The bike still works, though we worry about touring, since most of the parts that are prone to wear out have not been manufactured for nearly 20 years and are hard to find.  On the other hand, we’re not getting any younger, ourselves.  This bike has served us well for over 20 years; will we get as much use out of a new one?  Probably not, since I will be 90 by the time any replacement has served that long.  There are a few folks out there touring into their late 80s, but it’s been a hard winter for aging aches and pains, and we don’t have any illusions about living fit forever.

This has always been a problem of aging: we become accustomed to the familiar.  New is no longer exciting; we take comfort in that old pair of boots, that old jacket, despite a touch of mildew here and a rip there.  Cars and bikes become old friends, and they age with us.  Just as we adapt to creaky knees, thinning and graying hair, and a few aches, we overlook the peeling paint, the window that won’t roll down all the way, or a bit of chain skip.  In retirement, or planning for retirement, we become more and more reluctant to invest in new things we might not wear out, and we have less and less discretionary income to indulge in new things for the sake of slight improvements.  And, let’s face it–our modern machines are, in one sense, better than the old ones: because they are meant to be replaced after a year or two rather than overhauled, they are made reliable enough to run trouble-free for that “market life,” so much so that they will often last five to ten times longer before becoming completely unusable.

So, us old folks hang on to those obsolete items until replacing them means both sticker shock (a new tandem of equal quality is twice the price of the old one) and a steep learning curve from new technology (a bike we test-rode a couple of years ago had brake-lever shifters, something I had never heard of and didn’t find intuitive, resulting in a less-than-enjoyable test ride).  But, those old items that have lasted so long risk sudden and final failure, since they aren’t meant to be overhauled and parts may not be available, or gradually become impaired until they are no longer safe to use.

Just a few days ago, I was doing some practice rides on “Rocky,” my old faithful commuter bike that has carried me through several rainy seasons in Seattle and ten years of Montana winters, when I noticed the left pedal had a lot of play in it.  Being of a certain age, and a member of the “old school,” I took it apart (it clearly wasn’t meant to be disassembled–the “dust cap” had no designed-in means of removal, but I improvised), put the bearings back in the races and adjusted the cones, then took a 15-mile ride, by the end of which the pedal had nearly seized up.  I remember doing the same thing to one of Judy’s pedals when riding the San Juan Islands many years ago, but that was on the first set of pedals on “Leviathan,” and they had screw-on dust caps and easily-adjusted cones, so I was able to repair them well enough to get us home, though the races were scored and the pedals had to be replaced.   This time, I hesitated, because this bike is 14 years old, has been abused, left out in the rain, and had most everything replaced on it at least once.  I’m no longer commuting to work (the Internet is my office now), so, do I need this bike?  Do I need to replace it?   I looked through the glass at one of the new carbon-fiber road machines at an upscale bike shop in Olympia, still dreaming–40 years too late–of 22-mph time trial runs, but in the end, I ordered a new set of pedals, the third set for this bike.  It’s not fast–it’s a hard-tail, no-shocks mountain bike that I don’t ride in the mountains and it’s too slow for road riding, but I’ve ridden my “birthday miles” on it the last two years anyway, and might for many more.  A five- or ten-pound lighter bike is not going to take 30 years off the rider.

These thoughts linger with us as we prepare to head out on the trail this morning: we aren’t as strong as we were the last time through here, and we have slower reaction times.  The bike is just as heavy as it once was, a couple of years older than on the last long ride, and a few hundred miles further from its last tune-up.  We’ve been busy, focused on other aspects of our lives.  Are we ready?  Will the machine stop?  Will we be capable of seeing that end coming in time to act?  Meanwhile, our 16-year-old Jeep was making protest noises while inching onto the ferry today; it, too, has its issues: cracked windshield, windows that don’t roll down, leaky weatherstripping, missing and broken door stops, peeling paint.  Maybe the bike will get us home if the car doesn’t.  If only the rusty bolts on the bike rack don’t give out on the way home.

Backups Matter

Having been a Unix systems administrator for about 20 years and “in the IT business” for more than twice that time, I have always been aware that the tedious task of backing up your data can pay off in the end.  As we say, there are two types of computer users:  those who have lost valuable data and those who will lose valuable data.  But, backups can help.

The other day, while between crises at Chaos Central, our home/office/workshop, I was browsing through the web statistics for our public sites and actually paid attention to the list of ‘404’ (File Not Found) errors.  Now, a lot of these are the usual and customary web site attacks that go on all the time, “bad guys” probing for security holes that a) don’t exist because we are not using Microsoft Internet Information Server, or b) have long since been plugged in Apache, so I more or less ignore those, along with the blind probes for informational pages to gather email data to feed SPAMmers’ address lists.  But, since I have been using Google’s Webmaster Tools to aid in Search Engine Optimization (SEO) for my clients,  I’ve paid a bit more attention to little things, like having a ‘robots.txt’ file even if I don’t have any pages I want to hide from the search spiders.  Another one is the ‘favicon.ico’ file:  you don’t need one, but browsers always ask, so it is nice to have one.  Besides, it adds a distinctive icon to the address bar, tab, or ‘favorites’ list, so I have been creating those tiny graphics and adding them to sites as well.
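For reference, a permit-everything robots.txt, the minimal file that satisfies the spiders without hiding a thing, is just two lines:

```
User-agent: *
Disallow:
```

An empty Disallow directive means nothing is off-limits; the file’s mere presence keeps the 404s from piling up in the logs.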

But, this time, I also noticed missing pages where there shouldn’t have been any.  Oh-oh.  Until sometime last year, we had a number of links from our public server pages to our ‘extranet’ server, located in our home office and port-mapped to our external router.  Some issues with our Internet service at home and our impending move made it imperative to move these pages onto our public servers, which I more or less did, in bulk, to a virtual server, changing all the links to point to the new location.  I thought I had tested all of the links, but here we were, many months after the transition, with a broken link.  An entire page and all of its images, gone.

Meanwhile, the old extranet computer had not survived the household move.  After sitting in a cold, damp basement for a month or so last fall, it refused to start.  But, we still have the last set of backups!  So, I extracted the files from the backup archive and uploaded them to the web site.

For the record, we are, of course, a Unix shop, running mostly Solaris and Linux, so we use Amanda, the open-source backup system from the University of Maryland, and back up to virtual tape on cheap USB drives.  The drives are large enough to keep several weeks’ worth of backups, and we archive the last backup set from “retired” machines.  For machines like my Linux laptop, which aren’t necessarily on, or even in the network, 100% of the time, I use rsync, freezing a checkpoint now and then; since we do upgrade Ubuntu versions, we keep a pre-upgrade backup as well as current ones.  Because the Amanda backup runs daily, but we do a lot of work during the day, we also keep a snapshot backup (using rsync) every couple of hours during the workday on our main server/workstation.  For our one seldom-used physical Windows machine, we simply copy our working directories onto a file share on one of the Unix systems.
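Those rsync checkpoints amount to timestamped copies of a working tree. Here is a toy version of the idea in Python (shutil stands in for rsync, so there is no delta transfer or hard-linking; the function names are ours, not part of any backup tool):

```python
import shutil
import time
from pathlib import Path

def checkpoint(src: Path, backup_root: Path) -> Path:
    """Freeze a timestamped copy of src under backup_root."""
    backup_root.mkdir(parents=True, exist_ok=True)
    dest = backup_root / f"{src.name}-{time.strftime('%Y%m%d-%H%M%S')}"
    shutil.copytree(src, dest)  # full copy; rsync would only ship changes
    return dest

def prune(backup_root: Path, keep: int = 5) -> None:
    """Drop all but the newest `keep` checkpoints (names sort by timestamp)."""
    snapshots = sorted(backup_root.iterdir())
    for old in snapshots[:-keep]:
        shutil.rmtree(old)
```

In practice, rsync with its --link-dest option does the same job far more cheaply, hard-linking files that haven’t changed since the previous snapshot, which is why a cheap USB drive can hold weeks of them.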

So, there are many ways to do backups, but it is important to do at least one of them, whether or not it seems to be a pain–it will pay off.  Oh, yes, I am in the group of users who have lost data–in my case, I archived data, then failed to make a second copy before the original archive failed.  And, just because I’ve lost data once doesn’t mean I won’t again, backups or no backups.  But, keeping good and current backups does reduce the chances that the next loss will be soon or extensive.