Hall of Mirrors: Virtualization, part 2

As Chaos Central incorporates the new virtualization server into the network, the already convoluted relationship between desktop presentations and locality is rapidly becoming a hall of mirrors.  If you’ve ever stood between facing mirrors, you’ve seen the effect of looking at a reflection of a reflection, repeated into the distance as either reflectivity losses fade the shrinking copies of yourself to black or a slight misalignment of the mirrors curves the images out of sight.  Or, you might have experienced a more elaborate hall of mirrors in a carnival, where angled mirrors show images from distant parts of a maze, or multiple images of yourself from different views.

In computer networks, server and desktop virtualization, where one physical machine appears to be many separate machines, depends on remote network connections: terminal sessions, the X Window System, web applications, or, in the case of desktop virtualization, Virtual Network Computing (VNC) services.  At Chaos Central, we use all of these tools, both within our own network and to maintain servers and applications at remote client sites.  But, while we’re incorporating the new server into our network, things get a little fun-house disorienting.

We’re in the middle of a two-month-long data migration project at a client site, so the laptop that has been the mainstay of the company for the past year has, in one pane, a window that shows the desktop, or console, of a virtualization host, forwarded through a secure tunnel connection with several anchor points in the customer’s network. In that window are several terminal sessions, logged into client systems served by that remote host, and a web browser looking at the application on one of those hosts.  Confused?  Once back at Chaos Central, it gets more complicated.
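The forwarding chain itself is ordinary SSH port forwarding.  Below is a minimal sketch, not our actual configuration: every hostname and account is a hypothetical stand-in for the real anchor points, and it assumes an OpenSSH recent enough to support ProxyJump (-J).

```shell
#!/bin/sh
# Hypothetical anchor points inside the client network, hopped in order:
JUMP_HOSTS="user@gateway.client.example,user@bastion.client.example"
XEN_HOST="user@xenhost.client.example"   # the remote virtualization host
LOCAL_PORT=5901                          # local end of the tunnel
VNC_PORT=5901                            # VNC display :1 on the far end

# -N holds the session open without running a remote command;
# -J (ProxyJump) relays through each anchor point in turn;
# -L forwards the local port to the VNC service on the far end.
TUNNEL="ssh -N -J $JUMP_HOSTS -L $LOCAL_PORT:localhost:$VNC_PORT $XEN_HOST"
echo "$TUNNEL"

# With the tunnel up, a VNC viewer on the laptop connects to the local end:
echo "vncviewer localhost:$LOCAL_PORT"
```

The VNC traffic never touches the client’s network unencrypted; it rides the tunnel end to end, which is what makes the remote desktop feel (almost) local.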

In the process of reclaiming the upstairs office at Chaos Central, we’ve installed the new virtualization server on one side of the room, with its token text-mode console, and the old Windows XP machine with the new wide-screen monitor (a necessity, as we will see) on the larger desk.  Since the children’s electronics and paperwork still occupy the large worktable, and there are other complications with currently-running applications, we can’t move the laptop right now.  But, since the Windows machine has been converted essentially into a graphics terminal and serves as the control station for the virtualization server, the laptop is virtually moved upstairs through the magic of VNC.

So, now we have a simple Windows XP machine, which is, in itself, running only the Citrix XenServer management console, Firefox, TightVNC client, and PuTTY, an SSH terminal emulator (and, of course, the 1920×1080 widescreen monitor).  On-screen is where the hall of mirrors materializes.  Inside the XenServer console, we can select the console (text, or low-resolution graphics), one at a time, for each of the growing list of computer systems being built on the virtualization host.  Once built, we connect with each of them through VNC (to display the desktop), or PuTTY, or with a web application running on the virtual machine via a tab in the Firefox browser.

One of the VNC sessions on this now-cluttered screen is the ongoing business running on the laptop downstairs, which has, in the six panes currently configured, multiple terminal sessions, a Google Chrome browser with 30-odd tabs open, a Firefox browser with several tabs open, a half-dozen PDF documents open in xpdf or Adobe Acrobat Reader, and, of course, the VNC session linked to that far-off virtualization host and its client virtual machines.

Of course, all of this is not about the structure of this shaky hall of mirrors, but about being connected to the data we need to do a job, wherever we are and no matter where the results are delivered.  Virtualization is not new: the original premise of the Unix system was to make it appear to each of many users that he or she was running the computer:  one computer became hundreds, in a fly-eye view.  The use of terminals connected by cables or the telephone system allowed users to be physically separated from the actual machine.

The development of graphical user interfaces allowed a person at a personal computer or graphics terminal to connect to many processes on the same machine or to processes on many separate machines.  The ability to do the latter gave rise to the proliferation of special-purpose servers, each running a different application or set of applications, and using one of a variety of operating systems.  As computers got smaller, data centers continued to grow, replacing large mainframe systems with racks of specialized or redundant servers.  Space, power, and cooling requirements soared, along with the considerable investment in hardware and personnel to install, configure, and run the many systems.  Expanding the network often meant knocking out walls, or worse, moving the entire complex to a new building.  With uninterrupted data flow a business necessity, moving existing hardware was no longer an option: it became necessary to reduce the hardware footprint to permit redundancy of both hardware and data.  The first foray into accomplishing this was the introduction of blade servers, where separate processor boards shared a power supply, coupled with storage networks, where disk space became separate from the processors.

Still, adding capability required adding more hardware: hardware that operating logs showed was largely idle, waiting for requests from users, or in standby to take over in case of failure.  The way to solve this was virtualization, where complete server software and operating system images could be shared by a single hardware cluster, balanced for optimal loading, and distributed across minimally-redundant processing and storage capability.

All of this evolutionary and revolutionary design provides continuous business data availability, if done well.  For the IT department, it represents a continually changing environment, moving and replicating systems behind the scenes, while business goes on.  A bit less stressful than the career-breaking and empire-toppling outages possible in the days of non-redundant large systems, but unheralded and unnoticed, if everything goes well.

For the developer and consultant, virtualization means having many different computer systems at one’s disposal, with only one major hardware investment.  But, the migration is still labor-intensive, and results in a screen with hundreds of nested windows, where the depth of redirection can sometimes be detected only in the time delay between hand movements and movements on the screen, where network switching delays and redirection destroy the illusion of locality.

Virtual(ization) Reality

When Chaos Central was centered in Montana, we used to say our house was heated by computers.  This was partially true: when we retired our hand-me-down backup server, a huge Dell 2400-series server, our electric bill dropped by $5 per month, and the temperature in the garage workshop (aka the world’s smallest aircraft factory) dropped by 5 Fahrenheit degrees.  We didn’t have virtualization then, but we did have a single-board computer installed in one of our SPARC Solaris machines that ran Windows.

Over the years, we have had separate machines for Windows, Linux (SuSE, Red Hat, Ubuntu), FreeBSD, and Solaris, usually at least six running at any given time, plus at least one laptop for road trips, rigged for dual boot with Windows and Linux.  In our move to Washington State, the FreeBSD machine and one of the two Solaris/SPARC machines did not survive a couple of months in storage, so we have been down to four systems: the remaining Solaris/SPARC development machine, a Windows XP machine for running TurboTax, a Ubuntu Linux desktop for the Nice Person’s fabric arts business, and the Unix Curmudgeon relegated to a Ubuntu Linux laptop for everything else.

Having a laptop as primary computer is a good idea for frequent travelers, but the average laptop just doesn’t have the heavy lifting capability to run a Unix consultancy. Plus, we were traveling often enough that taking turns at the keyboard just wasn’t fitting our frantic schedule, so we added a netbook (running Ubuntu, of course) to the mix, which gets used mostly “on the road.”   Yet, the laptop ends up with too much (we’ve maxed the memory and disk capabilities), including things we don’t use on the road, and too little, because we are restricted to one system.

Meanwhile, the Unix Curmudgeon encroached on the Nice Person’s big Linux machine, installing VirtualBox with virtual machines running FreeBSD, CentOS, Fedora Linux, and Windows XP.  We’ve been running our finances with Quicken under Crossover Office or Wine on Linux for a long time, but some of the features in our 2002 edition weren’t available.  Just changing the company address on our invoices took exporting and hacking the QIF file and re-importing it, and we’d had a few scary recovery moments.   So, we upgraded to Quicken 2010, which, like many products now, is essentially incomplete in itself and requires Genuine Microsoft Windows to run.  This led to yet another instance of Windows XP (yes, we have licenses for these, along with the OEM disks, from prior systems, of which all of the hardware except the side panels with the activation code has gone to the recycle center) in virtualization.

Since the desktop Linux machine runs best with only one virtual machine at a time, it was time to rethink the computer inventory at Chaos Central.  For the past nine months, Chaos Central has been a multi-generational household, with children and grandchildren stashed in “spare” bedrooms, and the Unix Curmudgeon’s office taken over by the children for their own home office, leaving Information Engineering Services as a virtual company, operating from a laptop wedged between the floor looms in the Realizations Fabric Arts weaving studio/office/sewing room in the basement.   But, with the possibility of the nest being emptied (again) soon, the Unix Curmudgeon has stealthily begun to retake a bit of real estate in the upstairs office.

In the interest of efficiency and performance, we added a new Dell server to the stack, one several generations removed from the old Dell 2400 retired in 2008, and with a considerably smaller footprint.  Now, Sun VirtualBox is an excellent virtualization system–for a workstation–but, with Oracle systematically dismantling many of Sun’s former offerings, and the need to provide network-wide services, we decided to go with a Xen-based system.  Now, it is simple enough to provision virtual machines on any modern Linux, and nothing we haven’t already done for clients, but it does take a bit of skill in configuration.  A number of companies, like Citrix and VMware, have packaged virtualization solutions that run as the primary operating system on the virtualization host.  We selected Citrix XenServer as the candidate platform, if for no other reason than that it has a stripped-down, bare-bones free version: the platform foundation is a minimal CentOS distribution, with the Xen management utility replaced by a custom Citrix application (which, inexplicably, runs only on Windows).  Unlike the VMware limited 60-day free trial version, Citrix has no time limit, so evaluation can be more leisurely, and the limited feature set is probably adequate in the long term for a small consultancy that doesn’t have to manage dozens of VMs or worry about high-availability solutions.

The problem with adding a server-class machine to a small network is that a server is designed to be just that–a workhorse machine that delivers over the network–so most have robust disk controllers, high-performance disks, and high-speed networks, but poor or non-existent graphics capabilities.  We took the opportunity to acquire a large-format monitor (1920×1080) with our purchase, but the video capabilities of the server are way too modest to drive it (i.e., fixed video memory, no AGP).  But, because we do have the old obligatory, token Windows machine hanging around (which also runs Ubuntu in a Wubi installation), and we have to use it for the Xen management console anyway, the new monitor works quite well on the old machine as a simple graphics terminal.  The connections with the virtual machines will be via VNC anyway, so it isn’t necessary to have a graphics console on the server itself.  We dragged a surviving 14-inch monitor (which once served as part of a home-built “portable desktop” mini-ITX system) out of storage to serve as the server console monitor.

Building the virtual systems promises to be quite an adventure in itself.  A common use for virtualization in the Enterprise is to consolidate Windows servers that used to be relegated to separate machines, primarily due to the memory-management limitations of 32-bit Windows, or to provide Linux  in virtual private servers for web clients, where the virtual machines are essentially all built from similar templates.  In our case, we need to provide multiple Linux distributions, along with one or more versions of Windows, for software testing and development for Intel and AMD systems.  The adventure begins now.  Stay tuned for progress reports.
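On the XenServer host itself, building a guest from the stock templates comes down to the xe command-line tool.  The sketch below shows the shape of the commands rather than our actual build; the template and guest names are hypothetical, and xe vm-install prints the new guest’s UUID on success.

```shell
#!/bin/sh
# Hypothetical names; pick a real template from `xe template-list`.
TEMPLATE="CentOS 5.3"
VM_NAME="build-centos-01"

# Clone a new guest from the template, then boot it.
INSTALL_CMD="xe vm-install template=\"$TEMPLATE\" new-name-label=\"$VM_NAME\""
START_CMD="xe vm-start vm=\"$VM_NAME\""
echo "$INSTALL_CMD"   # prints the new VM's UUID on success
echo "$START_CMD"
```

From there, the guest’s console appears in the XenServer management application on the Windows machine, and, once networking is up, we reach it the usual ways: VNC, PuTTY, or a browser tab.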

Labor Day

Labor Day is just another workday when you work from home…

Here we are, on the first Labor Day weekend since our “non-retirement” nearly a year ago, also the first anniversary of unloading at least some of our “stuff” at our new home and close to the second anniversary of the official start of the Great Recession.  It is becoming increasingly apparent that the era of the traditional “retirement” that folks born before the Great Depression have enjoyed is over.  Those of us between the Greatest Generation and the Boomers are in a kind of limbo where expectations and reality are diverging rapidly.

A number of factors weigh in on this sea change in our perception of the “Golden Years” as we approach the second decade in the new century.  First, many of us in our seventh decade of life are much more active and healthy than our parents’ generation was.  In the mid-twentieth century, retirement age was very close to the average life expectancy of those entering the workforce.  Statistically, only half of those entering the workforce would survive to see retirement, and those who did were no longer expected to be productive workers by that time, and would live in retirement only a few years.  Today, the average life expectancy of someone entering the workforce as a young adult is in the mid-70s.  Those of us already at retirement age can expect to live another 16 years–on average!

This is no surprise, as life expectancy has been increasing steadily for many decades, while the retirement age, set in the 1930s, remained the same until recently and is only gradually rising.  “Retirement” is no longer an extended vacation or extremely long weekend of household puttering: few today, raised in the age of easy credit and ravaged savings, have the finances to travel and play or indulge in expensive hobbies for decades, and, well, not all of us enjoy home repairs and gardening enough to devote all our waking hours to them.  So, retirement becomes an opportunity either to seek a new career or to transform the old one.

In the case of the Unix Curmudgeon, our life hasn’t changed much so far, except the office is downstairs instead of across town, but we’ve been in that situation before, in the mid-1990s; the hours are a bit more flexible, but the result is that just as much work gets done, for less money and stretched across longer days.   Tasking is still driven by email and the intensity (and volume) of work varies proportionally with the urgency of server log messages.

“Working from Home” is the flip side of “FAXing from the Beach,” the topic of an earlier post.  While the home worker has increased personal freedom, he or she is essentially immersed in work, 24x7x365. Still, this arrangement has its merits: work gets done, but on a personal time scale.  The world of business isn’t quite ready for this transition, though.  The modern office culture, developed when communication was by memos rushed from desk to desk, is archaic when documents and messages can be instantly transmitted around the world, but is ingrained in corporate policies and procedures and the mindset that unseen workers are unsupervised workers.  Granted, telecommuting doesn’t scale well to many jobs, though it is surprising how many it does cover well.

System administration and programming are nearly ideal candidates for remote work, except for those rare times when a certain amount of “laying on of hands” is necessary, like installing, removing, or repairing hardware.  Installation and removal can be largely scheduled well in advance.  The computers I tend most of the time are physically 600 miles from my home-based office, sometimes further if I am traveling: I have made in-person appearances several times over the past year, sometimes just to remind the community-at-large that I still do work there, and sometimes to man-handle the hardware.

My last trip to the client’s site involved shutting down and removing an old server, a more or less symbolic act, since I had, over the previous few weeks, migrated the functionality to other machines and even changed the identity of the machine on the network.  The other part of the equation, repair, becomes almost a non-issue with the proliferation of virtualization and high-availability clustering.  As long as there are enough hosts to handle all of the services needed, the virtual server images can be moved from one to the other and computing loads balanced among the remaining servers in a cluster, almost behind the scenes, so that even repairs can be scheduled at a convenient time.
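With pooled XenServer hosts, that kind of scheduled evacuation is essentially a one-line command.  A hedged sketch, with hypothetical guest and host names, assuming pooled hosts with shared storage (what Citrix calls XenMotion):

```shell
#!/bin/sh
# Hypothetical names: the guest to evacuate and the host to receive it.
VM="client-web-01"
DEST_HOST="xenhost2"

# live=true moves the running guest without shutting it down, so the
# now-empty host can be powered off and repaired at a convenient time.
MIGRATE_CMD="xe vm-migrate vm=$VM host=$DEST_HOST live=true"
echo "$MIGRATE_CMD"
```

Users of the migrated service see, at worst, a brief pause; the hardware under it can then be serviced on a weekday morning instead of a holiday weekend.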

So, this Labor Day, as I monitor the migration of hundreds of thousands of files to consolidate several older, smaller servers’ data into one, and draft a quote for the next year’s contract with my primary client, I realize that retirement is a relic of the past: the private and public pensions available to persons of a certain age only serve to supplement the uncertain income of hourly contract fees and permit a slightly more relaxed work schedule (though when projects dictate, the workday sometimes runs well over eight hours and through the weekend).

Labor Day is a celebration of continuing to be a productive member of the workforce, at an age when previous generations of workers were expected to cash in and step aside.  That was a noble idea when Social Security was invented, to make room for younger, more able workers in an age when unemployment was more severe than it is today.  But, today, skill and experience count, too, and age is not a factor for knowledge workers, as long as their minds are agile and accepting of new ideas.

We’re used to that–in the 45 years since we’ve been in this business, the only constant has been the rate of change, governed by Moore’s Law.  Ever increasing speed and capacity opens new avenues for change.  Some of those changes have been waiting years to be practical to implement: for instance, microprogramming, the principle on which most modern computer architectures are based, was invented in the early 1950s, but had to wait until the development of large-scale integrated circuits in the late 1970s to become realizable on a practical basis.  Multi-processors and distributed clusters, once reserved only for vital scientific and military purposes, are now affordable to everyone, but only a few of us have long experience with programming and using them.

The exuberance and curiosity of youth is a great boon to business, but so is the calculated patience of us elders.  The curve of Moore’s Law is flattening–the secret to speed and capacity in the future lies in the techniques of the past, by which we wrung performance out of those early, limited systems, with slow processors and small memories.  Old is useless only if what it does isn’t needed anymore (if it ever was) or is superseded by a paradigm shift.  Who knows paradigm shift better than those of us who have lived through the entire history of computing?