Why Snow Paralyzes Puget Sound

Note:  This was written on battery, in the dark, during the obligatory post-snowfall power outage at Chaos Central.

The folks who live on the Great Plains and along the Great Lakes snicker a bit at news reports on the mass closures in Seattle when snow comes to the Puget Sound region. Of course, the standard excuse is, “It doesn’t snow that much here, so we don’t have enough snow removal equipment.” But, wait, there’s more…

Most of us live in the Pacific Northwest because of the geography and the climate. The climate is temperate, with few days below freezing in the winter and few days in the 90s in the summer. Most of us live within 20 miles of either salt water or a fresh-water lake. If you want snow, you can find it in the nearby Olympic or Cascade mountain ranges 12 months out of the year. Down at lower elevations, the lush green mountains invite backpackers, the flat river valleys attract bicyclists, lakes and estuaries fill with kayakers and canoeists, rivers with rafters, and the larger lakes and the Salish Sea (the accepted name for the inland sea that stretches from Olympia, Washington to Prince Rupert, BC, bounded by the Olympic Peninsula and Vancouver Island) fill with sailboats and motor craft of all sizes.

For all of this, we are willing to put up with the moisture that is a natural consequence of air warmed by the Pacific Ocean currents flowing up the coast from the south. Rainwear is a fashion statement, and umbrellas are considered a mostly ineffective and cumbersome nuisance in a land where the average day ranges between swirling mist that clings to everything and fire-hose-like sideways blasts with downpours so heavy that motorists have to brake for migrating salmon. And, for a few days every odd year or so, cold and wet coincide in a heavy snowfall like those that come in waves winter-long in places like Minneapolis and Chicago. But, this is not the Midwest.

Double-digit grades and ice do not mix: the street in front of Chaos Central

First, because of the geography: the region is characterized by foothills of the mountain ranges that frame the inland sea. The foothills give way to steep bluffs of glacial till that fall off sharply into the river valleys and sea, up which rise city grids on double-digit grades more suited to Olympic ski jump ramps than walking, driving, or pedaling. And second, because of the climate: the temperate nature of the climate dictates that, when the temperature does dip to the low end of the range, it hovers back and forth across the freezing line. The mass of the inland sea also helps moderate the temperature: when the wind is calm, the snow level sits no lower than a couple hundred feet above sea level. But, the wind is never that calm, which means the rain that falls on the lower elevations, itself barely melted snow, falls on frozen snow, and therein lies the root of the problem.

Our camellia bends over under the weight of clear ice on top of snow

Steep slopes plus ice plus a preponderance of evergreen foliage combine for interesting times. The foliage includes firs, pines, and cedars, which are better adapted to snowy conditions but less so to strong winds, since the wet climate promotes shallow root structures. Other temperate evergreens include broad-leaf plants like the native rhododendron, holly, and other ornamental shrubs found in cities. Deciduous trees in cities tend to become infested with English ivy and other climbing evergreen vines, which make them vulnerable to wind and snowload as well.

So, in the aftermath of one of these snow episodes, the snow plowing efforts, inadequate to begin with, leave many streets unplowed and packed down by adventuresome traffic. Trees are heavy with snow. And then, it rains. At first, since snow is an insulator, the rain simply combines with the snow to form a heavy crust of ice on top of a fluffy layer of loosely-packed flakes. This results in two different effects: first, the leaves and needles of the trees hold the snowballs in place until they become ice balls, at which point the tree either breaks apart or falls over, usually on a power line. Buildings designed for “normal” snow loads may collapse as well as the ice builds up. Second, on steep slopes where there are no trees (because this happens often enough to discourage their growth), the weight of the ice overcomes the integrity of the fluff beneath, and the whole mass heads downhill, picking up speed and mass as it goes, sweeping everything and everyone before it, often ending up in a solid mass of lumpy ice and boulders across a major highway.

While major power transmission lines have usually been built well clear of historic avalanche zones, the metropolitan and rural power grids are not so fortunate, being in the midst of so many snow and ice-laden trees: power outages soon follow, lasting anywhere from several hours to several days, depending on the distance from the substation to the consumer. Getting to the site of the outage to repair the damage or getting to a location where there is power is also problematic. Where the roads have been plowed and walkways shoveled, the rain freezes directly on the surface, creating a virtually frictionless surface not suitable for either walking or driving. New power outages crop up where cars skid off icy roads into power poles or trees.

Meanwhile, the temperature continues to rise, and the snow level moves higher, leaving behind a slushy mixture that begins to be more water than ice. The water moves downhill: the ice holds it back. Water backs up into basements and garages and fills dips in streets and roadways. Then, it all ends up suddenly in streams and rivers, which quickly overflow onto low-lying areas that may have been spared the heavy snow and ice loads but now find themselves underwater.

If the snow load was large enough and the rains continue, the clay bluffs continue to absorb water until they, too, become too heavy and slippery, and the hillsides collapse into the bays, taking trees, houses, and roads with them. Fortunately, the combination of wet and cold winters happens only once a decade or two in this area, so people rebuild and forget the troubles by the next sunny day when the “mountains are out” and they head for the beaches, marinas, backroads, or trails.

Teaching New Dogs Old Tricks: Ubuntu for Unix hacks

Here at Chaos Central, we’ve been wildly excited about our new crop of Ubuntu 11.10 Linux computers from Zareason, the arrival of which was discussed here a couple of weeks ago.  But, thrilled as we are with the speed and capacity of these boxes, there is a learning curve–for the box, not the user.  Unix, and, by extension, Linux, has been evolving for more than 40 years now, and has a huge library of Useful Things accumulated, and a history of competing distributions, each with its own “flavor” and set of favorite tools.

In the beginning, Unix was a toolbox for scientists and engineers to build things quickly and cheaply: initially, document processors, and utility software, and, later, the Internet itself.  The growing popularity of Linux, and particularly the Ubuntu distribution, has driven the need to make it useful for the things “ordinary” people–i.e., those who don’t make a living from tweaking the innards of Big Iron computing–like to do, like surf the net and manage their music, video, and image collections.  And, since Linux is still the answer to what to do with your old PC when it gets too bloated with malware and spyware, the popular distributions still need to fit on a CD.

Most industrial-strength distros now come on DVDs, sometimes more than one, containing the entire Linux collection of free software.  But, for the masses, armed with CD-only PCs, something has to give, and what gives is often the venerable, legacy Unix features that most home users will never need.  Yet many of those are exactly what we’ve been lugging around on our hard drives for 20 years or more: venerable editors like emacs and vi, superseded by simple menu-driven editors or integrated graphical development environments, but still more powerful and with more features than anyone will ever use.  The ones I’ve learned are extremely useful and well-practiced enough to be second nature, so I keep using them, even if they don’t come with the system anymore.  And, more recently, enterprise-level tools for building massive compute clusters, like GridEngine and MPICH2, along with software development libraries and specialized utility libraries for science and engineering.  We also need a lot more development tools than come with a standard desktop, since we develop software for the web and for high-performance computing clusters.

Fortunately, the Ubuntu software repositories have a lot of those tools packaged up and loadable from the Software Center application, so we don’t need to go through the ritual of downloading, unpacking, configuring, compiling, and installing nested sets of dependent programs like the old days.  But, our old computers have accumulated a unique set of software over the four or five years of their busy lives, so the new ones have a lot to learn:  we first had to load the no-longer-included Synaptic Package Manager to grab some of the software libraries and utilities not available in the Software Center catalog.  And, of course, get rid of that silly Unity desktop that only works well for folks who only do one thing at a time with their computers.  We have to have lots of toolbars visible and lots of workspaces to which we can jump with a single click, which Gnome gives us.
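For the record, the catch-up ritual on a fresh box boils down to a handful of commands.  This is a rough sketch; the package names are the ones we recall from the Ubuntu 11.10 repositories and may differ on other releases:

```shell
# Refresh the package index first
sudo apt-get update

# Restore the classic Synaptic Package Manager, dropped from the default install
sudo apt-get install synaptic

# Pull in a basic development toolchain (compiler, make, headers)
sudo apt-get install build-essential

# Add the classic GNOME session as an alternative to Unity;
# choose it from the gear menu at the login screen afterward
sudo apt-get install gnome-session-fallback
```

From there, Synaptic can search out the more obscure libraries and utilities that don’t appear in the Software Center catalog.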

Not surprisingly, some of the more esoteric and least-used packages still have a few surprise unresolved dependency issues.  To my delight, GridEngine, a distributed job control system for compute clusters created by Sun Microsystems a dozen or more years ago, was available in the Software Center.  Since Oracle bought Sun a couple years ago, a lot of these tools have disappeared off the free download list at Oracle, folded back into the supported product lines, and the old packages are sometimes hard to find.

GridEngine is one of those transitional systems that, unlike the new applications designed to run under the Gnome or KDE desktop management systems, was born and developed during the days of OpenWindows (contemporary with Microsoft Windows 2) and the Common Desktop Environment (CDE, which predates Windows 95), both high-end X Window System desktop managers in their day.  X11 programs used to be a lot harder to write, and were designed for networking more than for having the client and server on the same workstation, so they tended to leverage the Unix philosophy of lots of little programs, each doing one thing well, working together, much more than the more integrated and abstracted desktop applications of today.

GridEngine is more likely to be installed on an Ubuntu machine as a client or, at most, an execution node in an ad hoc cluster, rather than a master host, so it would not usually run the graphical grid manager, qmon.  But, Open Source being what it is, the whole package is available.  However, the “just works” philosophy of Ubuntu breaks down here, as the dependencies of the archaic and arcane OpenWindows flavor of the graphical component aren’t checked very thoroughly, and there is a bit of a problem.  The application depends on the X11 font server, a client-server application designed to facilitate running X11 clients on one machine and displaying them on another machine’s X server that might not have all of the requisite fonts loaded.  Also, because CDE relied heavily on licensed Adobe Type 1 fonts, the chain of dependency gets broken when it comes to fitting old non-GPL’d software into a Linux distribution.

When you get a GNU/Linux system distribution, everything in it is licensed under the GNU General Public License.  You can install anything you want in addition to that, but you can’t package the extras and redistribute them.  This also extends to the packaging system.  The Ubuntu Software Center has provisions for adding non-free (i.e., non-GPL) software repositories, but the free and non-free repositories don’t resolve each other’s dependencies, so complex packages like GridEngine, which depend on non-free components, come with “some assembly required.”  In my case, someone had already solved the problem for Ubuntu, so a Google search turned up a list of the missing packages and how to install them.  I’ve used GridEngine for years, but on Solaris and RedHat Linux systems.  Solaris, of course, was licensed from Sun (now Oracle) and had full support.  The old Sun GridEngine for Linux packages came with the non-free fonts and dependent packages integrated, because you got them from Sun–they weren’t on any of the five or six CDs (now two DVDs) that comprise the full Red Hat Enterprise Linux (or, as many of us who don’t need any hand-holding support from Red Hat use–CentOS).
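The fix we pieced together from that search looked roughly like the following.  The package names here are assumptions based on the Debian/Ubuntu gridengine packaging of the era, so treat this as a sketch rather than a recipe:

```shell
# Install the GridEngine client tools and the qmon graphical manager
sudo apt-get install gridengine-client gridengine-qmon

# qmon's OpenWindows-era GUI expects the legacy X font server and the
# old bitmap font collections that no longer ship in a default desktop
sudo apt-get install xfs xfonts-75dpi xfonts-100dpi

# Restart the font server so the new fonts are served to X clients
sudo service xfs restart
```

With the font server running and the bitmap fonts in place, qmon stops complaining about missing fonts and comes up with its vintage button-panel interface intact.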

So, at Chaos Central, the new dogs are gradually getting housebroken and have largely quit chewing on the furniture–i.e., have learned enough that they no longer respond, like HAL 9000 from “2001: A Space Odyssey,” with “I’m afraid I can’t do that…” when asked to perform tasks the other computers have been doing for years.  They’ve even learned, with larger memory and faster processors, to do new things.

Upgrade Challenges: Avoiding the “Microsoft Tax” and Buying American

The signs that the Great Recession is receding can be found in the return of a tradition taken from the pages of the Christian scripture, the Christmas Shopping Frenzy.  We’ve never understood how the one-paragraph note in the Gospel of Matthew–that describes the arrival of the Magi bringing gifts to the infant Jesus–has created, two thousand years down the vortex of time, a world-wide phenomenon, a seasonal gifting orgy of conspicuous consumption that transcends and even obscures the religious symbology of the gifts.  Not to mention that the original story ended badly: the local political power (Herod) subsequently engaged in a horrific massacre of male infants in an attempt to eliminate a perceived threat of competition from the giftee, while the gift-givers fled to avoid interrogation and the target of the pogrom was whisked away to a foreign country.

Here at Chaos Central, we observe the Christmas tradition in a more subdued manner: a small celebration with close family who are practicing Christians, with small gifts, some hand-made, and exchanging end-of-the-year greetings with friends.  And, since, by coincidence, the holiday does happen at the end of the calendar (and tax) year, we do evaluate our balance sheet and make a few last-minute gifts to charities, as well as gifting ourselves with any tax deductible business purchases that were on the near-term planning cycle.  A truly secular side to the “shopping season.”

In this topsy-turvy economy, where millions are unemployed but there is no shortage of merchandise made “off-shore,” and the number two and three brands sell well only because the number one brand sold out during the enforced shopping frenzy, it makes sense to consider making small changes to how we buy things, to reverse the economic trends that have brought us to the brink.  Competition is healthy: the existence of near-monopolies stifles innovation, despite the fact that innovation may have created the monopoly in the first place.  That’s one reason we put off buying computers: there are simply too few alternate choices.  It is much too late to “Buy American,” because very little manufacturing is done in this country anymore.  Almost all computers, whether running Microsoft Windows or Apple OS X, are manufactured offshore.  And, those are the only choices of operating systems in the big-box stores.  But, we can at least buy things assembled in America, and with a choice of open operating systems, if we look for them.

In our case, it is time to upgrade our aging stable of computers, reassigning an 8-year-old workstation now suitable only as a graphics terminal for virtual machines (and not very good at that), and a 4-year-old laptop with limited memory and disk size.  We need a laptop suitable for hosting multiple virtual machines and a full multi-media desktop to adequately support our business projects.  These are overdue, postponed while we weathered the deepest and most personal effects of the Recession.  Things fall apart.  Not literally, but the thread of progress gets unraveled when there is no growth due to lack of capital.  [Lest we get confused, capital is not profit–what is wrong with much of America is the failure to invest in capital, in an attempt to keep up the appearance of profit during downturns.]  Computers last upwards of ten years if properly maintained (we have one, a Sun workstation, still running after ten years, and 15- and 20-year-old “retired” computers that will still boot up), but they are not cost-effective after four or five years, as they simply cannot do the work to remain competitive against newer systems, and are often not able to run newer software efficiently, if at all.

For some years now, adding a new computer running Unix or Linux to the network at Chaos Central has usually involved buying an off-the-shelf machine and stripping off the unwanted default operating software–i.e., the currently shipping version of Microsoft Windows–or ordering piece parts and building a bare machine from scratch, which is possible for desktop machines but difficult in the case of laptops.  In both strategies, the machine is not ready for use until a suitable replacement environment has been installed.  Server-class machines can be (and have been) ordered with no installed operating environment, but the choice of portable systems and compatible desktop workstations has been limited to systems manufactured by Apple, running the OS X operating system, or a wide variety of other Intel and AMD-based machines–all running Microsoft.  While OS X is a variant of Unix (Darwin, a BSD layer atop the Mach microkernel), GNU/Linux, Oracle Solaris, and FreeBSD are more commonly compatible with the server systems that we administer and program for clients, so those are what we want on our desk and in our luggage.

Fortunately, there is a large enough market, 20 years into the GNU/Linux revolution, so a number of enterprises have sprung up to build and sell systems that run Linux “out of the box.”  A few major manufacturers, like Dell, did offer Linux choices at one time, but for various reasons–too small a segment of the commodity desktop/laptop business at that time to diversify software choices; and/or problems with Microsoft OEM licensing agreements that applied to product lines rather than individual machines–they dropped the offerings, except for the much smaller and more customized server product lines, in which case they only sell licenses and media: installation and configuration is left up to the buyer.

Small-footprint desktop, preloaded with Ubuntu

However, near west coast port cities, like Seattle, San Francisco, and Long Beach, the ready availability of computer piece parts in economical small lots from tier 1 importers makes it possible for small businesses to build custom non-Microsoft computer systems at nearly-competitive prices.  As it turns out, the market for Linux workstations overlaps with the market for high-end video game machines–with powerful graphics, multi-core processors, and lots of memory–so there is a plentiful supply of components, most of which aren’t found in commodity desktop machines anyway, so the price difference is well within reason.

We like a bargain as well as anyone else, but, as a small home-based business ourselves, we prefer paying a little more, knowing that that extra is providing a living wage to fellow entrepreneurs and folks who love what they do, not boosting the portfolio of an executive as a bonus for outsourcing the entire product and support pipeline to southeast Asia.  We bought our new machines, a high-end laptop and a workstation powerful enough to serve multimedia applications, from Zareason, a small company whose owners we met at Linuxfest Northwest last spring, where we got to check out their offerings.

Buying locally-assembled products isn’t bringing back “Made in America” factories, but it’s a start toward turning a nation of consumers into a nation of producers who take pride in what they make with their own hands and minds.    We’ve written here a lot, recently, about our adventures on our “Made in Oregon” tandem bicycle, and we’re now configuring the next generation of Linux computers, “Made in California.”

Restoring Chaos, Post-Tour: Settling in for the Winter

After our impromptu grand tour of the U.S., through 19 states by car and bicycle, we are home at Chaos Central at last. Our 30-day adventure evolved when the planned fall bicycle tour of Upper Michigan got derailed in September, and simply overtook our lives for several months. We had returned home at the end of September, having traveled over 6000 miles in our loop from Washington to Wisconsin to California, with only 30 days to plan our next trip.

Faced with a longer, more difficult bike trip, we augmented our bicycle camping equipment and actually managed to work in an overnight bicycle camping trip to test our gear and our fitness, after having not ridden for a month. A week or so before leaving, we had hosted a succession of late-season bicycle tourists headed south on the Pacific Coast bike trail, and had ridden with the last group on a final 30-mile “training ride,” not exactly a proper training and conditioning regimen for a planned 400-mile fully-loaded tour.

One of the factors affecting our pre-tour training was the need to catch up on work projects largely shelved for our September excursion. The other was the onset of cool, wet weather in the Pacific Northwest. When we finally left on the final stage of Tour 2011, headed east through Montana before angling southeast toward Florida, morning frost followed us as far as St. Louis. Finding ourselves riding 60 miles a day in 80-degree heat and searing sun was, quite frankly, a shock. That, combined with road conditions unsuitable to our equipment, led to modifying our tour to suit our capabilities, in mid-tour, well documented in earlier posts, as was the long trip home across I-10, I-8, and I-5.

When we at last arrived back at Chaos Central, we found our network down, from a power outage incurred several days earlier during the first winter storm of the season. Getting the services back on line was necessary before resuming work projects. While we were gone, our landscape contractor had been busy, with most of the major digging, rock work, and large plantings done. But, much detail remained, taking a couple of weeks of distraction. A scheduled servicing of our now-not-so-new Jeep Patriot showed that the 35,000 miles we had driven since January, mostly at freeway speeds, had worn out the factory tires. Indeed, the noise level in the car had grown steadily on the trip home, as the wear bands moved closer to the road.

In the midst of the emergent work, which included some software issues on one customer’s server, email issues with another, and web updates and issues for several others, we began to unpack, finding places to stow our newly-revised camping gear complement, and performing cleanup, tuning, and reassembly of the Bike Friday “Q.” The rain fly on the tent was a bit musty, having been packed since that last dew-soaked morning on the shores of Lake Okeechobee nearly two weeks before, but aired out fine and was repacked. The bike did not seem to suffer as much as we feared, after its dousing with brackish water on the ill-maintained bike trails in South Florida. Application of an aircraft anti-corrosion treatment (ACF-50) on the trailer parts and some of the lower frame parts seemed to take care of visible effects. But, the real shock was the discovery of a broken spoke on the rear wheel: this had undoubtedly occurred during our mad dash down the Overseas Highway, which had resulted in three flat tires, and loosened the headset and the trailer hitch. I had probably ridden the 76-mile day on it, but hadn’t noticed any handling problems, since the wheel is built for tandem loads.

Another casualty of the Marathon Gauntlet: a broken spoke on the rear wheel.

Fortunately, the broken spoke was on the left side, so could be replaced without removing the freewheel cassette. Bike Friday does include two spare spokes of each of the three sizes needed. But, we did not have the freewheel removal tool in our kit, an omission we quickly remedied with a stop at REI on our next trip into the Big City, along with other supplies to finish the post-tour grooming of the bike, something that is progressing slowly.

Work still hasn’t caught up: a couple of projects have yet to be restarted; planned end-of-the-year computer upgrades have not yet been ordered. When the bike is finally assembled, we need to construct indoor window inserts for the winter, an alternative to simply taping plastic to the old single-pane windows at Chaos Central. The lumber and plastic film awaits. All we need is time…

End of a hard season: salt-crusted bike cap

Tour Diaries — End of the Road: There’s No Place Like Home

An early departure from Redding brought us by dawn to the southern border of the mythical state of Jefferson, that slice of the Pacific Northwest that encompasses the mountainous area between the Sacramento and Willamette Valleys.

Sunrise on Mount Shasta

Lunch today is Thanksgiving leftovers at roadside stops, and we quickly find ourselves through Oregon and back into our home state of Washington, in time to pick up our cat from “Just Cats” halfway between Olympia and Shelton, arriving at home just before dark, unloading the car from our long 30-day, 8,000-mile circumnavigation of the country. In the night, the rain starts, a soft, steady cascade that continues through the next day. We are home at last.

Afterword:

Several days before we arrived home, the first harbingers of Northwest winter had knocked out power long enough to drain the server batteries at Chaos Central, so we spent a bit of time getting all of the network services back on line, particularly those that require manual initiation after reboot. For the last few days of our travels, not being able to “call home” to use our tunneled, encrypted proxy for Internet browsing safety, we relied instead on the Tor anonymizer service, which, though designed for a different goal, is, for our purpose, also an encrypted remote proxy that protects from packet snooping and Firesheep attacks as well as disguising the location. Our dedicated proxy server makes all our browsing appear to come from our home location, while Tor packets are scattered among many locations. Both ideas are useful for working through firewalls and for maintaining security in untrusted public networks. At hotels and coffee shops requiring a web-based login, we use a separate browser for the network login, then do all our other browsing through the proxy or Tor network.
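For the curious, the dedicated-proxy trick is nothing more exotic than OpenSSH’s built-in SOCKS proxy.  A minimal sketch, assuming a reachable home server (the hostname below is a placeholder) and a stock Tor install:

```shell
# Open a local SOCKS proxy on port 1080 that tunnels all traffic
# through the home server; -N runs no remote command, -C compresses.
# "home.example.com" is a placeholder for your own machine.
ssh -D 1080 -N -C user@home.example.com &

# Then point the browser's SOCKS settings at 127.0.0.1, port 1080.

# The Tor alternative: start the daemon and point the browser at
# Tor's default local SOCKS endpoint, 127.0.0.1 port 9050.
sudo service tor start
```

Either way, everything between the laptop and the first trusted hop is encrypted, which is what defeats the coffee-shop packet snoopers; the difference is only where the traffic appears to originate.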

Hey, all this ‘Net jargon must mean we’re back at work. But, for 30 days, we suspended time and lived in another set of problems. If that’s what a vacation is, then we must have had one, the longest ever.

