The Internet went down today, at least at our house, and at an unknown number of other houses on our street, along with their TV feed (we don’t have TV). But we know about the others only because we have a smartphone: I managed to keep the wifi turned off long enough to log in to Comcast via the cellular network for a system status update. This may seem the height of absurdity, needing to access the Internet to find out why the Internet is down, but that is the future to which we have come. We used to have phone service on cable, too, which would have left us totally deaf and blind, but with cell phones it is possible to call tech support. Except we use the Internet to find phone numbers, and I’m not sure we have a paper contract or information packet with the support number. At any rate, the Internet has also brought the decline of the help desk: it is much more efficient to have the computers check your connection status than to explain your location and account number to a person (after waiting a very long time in the queue) and then phrase the question properly. The web app checks our Internet connection (yes, it is down), and then announces “an outage has been reported in your area.”
Sitting in the office without Internet is a bit like sitting in the house with a general power outage. We still have lights and computers, but–as I am doing now–we have to write to local files for later upload instead of interacting with the blog server out in the “cloud” somewhere, a bit like reading by candlelight. There was a time, 20 years ago, when we composed email on our computer, after which the computer would initiate a call on the modem to contact the next server up the chain, send the mail, and receive any waiting incoming mail. A few of our friends who live beyond cable and fiber still use dial-up, but the sound of a modem negotiating a connection is as rare as the clop-clop of horse-drawn carriages on Main Street.
So, as we wait to get reconnected with the day’s crop of cute cat videos, we can reflect a bit on not only how far we’ve come, but how far we have to go. The next wave, of course, is to get completely unwired, with community high-speed broadband wifi, affordable cellular networks, and wearable, always-connected computing. I’m not sure about the public in general, but for us, traditional television is dead–we haven’t had a TV for at least seven years. The future is in Internet services like Netflix: movies on demand, news stories on demand, and some mix of live streaming feeds, as we already have with the major news services and Net-centric services. A high-speed cellular network can (but probably won’t) remove the single point of failure that the “last mile” wired connection represents. With the arrival of ubiquitous networking comes the newest tablet system, running Google OS, where the device supplies only a display and a connection to processing and data storage hosted in “the cloud,” a distributed network of huge data centers scattered across the world. Without a network connection, the device is as useful as an unwound pocket watch.
Which brings us to another point: with cell phones constantly connected to the phone network, we have no need to wear or carry timepieces anymore: a generation of plastic LCD or LED wristwatches has become junk, and the mechanical watches and clocks of an earlier age have become quaint pieces of animated jewelry. To wear such jewelry, or other ornamentation made from the dissected parts, identifies one as part of the steampunk movement, a re-imagining of a future where the workings of civilization are visible and can be tinkered with, where function merges with style, as in the hand-wrought brass and filigreed cast iron implements and open-frame steamworks of the early industrial age. The computer age has passed so quickly from a vast tangle of wires and visible circuits to slick slabs of glass with microscopic complexity embedded within that the magic has turned from white to black: one can no longer understand the machine simply by observing it operate.
Which brings us to the obvious: the only constant in the last half-century has been the rate of change. We must constantly adapt to new ways of doing old things and get used to doing things we didn’t imagine a few years ago (even if we are avid science-fiction fans: sci-fi is always a comment on extremes taken to their logical (or illogical) conclusion, while reality turns away from extremes, often in a completely different direction).
So, now well into our fifth year of non-retirement, we keep moving forward, not only exploring new activities associated with actual retirement, such as more frequent travel and taking up new hobbies (often at the expense of old ones), but also keeping up with the state of the art in our chosen profession. In the last few weeks, we have set up a virtual server to explore the concept of containers, a not-new but relatively undeveloped way to isolate different services hosted on the same machine. What makes this attractive now is the emergence of Docker, a nascent container management system that makes it easy to build, administer, and distribute containers focused on a single application. As with all emerging technology, it is still brittle and requires specific hosting configurations, but it is a very promising approach to a new way of distributing and hosting Linux applications.
At the same time, we are learning to use Git, a fifth-generation software version control system, and have set up a git server in our office network. We’ve used version control systems since graduate school in the 1980s: first the original SCCS (Source Code Control System); then the simpler and excellent RCS (Revision Control System), which we admittedly still use for local version management when developing and administering systems; brief encounters with CVS (Concurrent Versions System), which introduced client-server modes as Unix moved from a single mainframe with terminals to a network of servers and workstations; and, fleetingly, SVN (Subversion, a major remake of CVS). Each of these progressively moved beyond a simple difference model in a single directory on a single machine, building on the common tools of Unix and network protocols to make collaborative development possible on a world-wide scale. Git, by virtue of being the tool of choice for Linux kernel development, has become the new standard; it also has the advantage of using a snapshot model of the project space rather than storing differences between revisions. Of course, these networked tools beg to be hosted on repositories “in the cloud,” which requires an Internet connection to fetch and update files in collaborative projects.
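The snapshot-versus-difference distinction can be sketched with a toy content-addressed store. This is an illustrative Python sketch, not git’s actual implementation; all names here are made up for the example:

```python
import hashlib

# Toy illustration of the snapshot model: each "commit" stores the
# complete file tree, addressed by the hash of its contents, rather
# than a delta against the previous revision (as SCCS/RCS/CVS did).
objects = {}  # object id -> full snapshot, loosely like git's object store


def commit(tree):
    """Store a full snapshot of the project tree; return its object id."""
    blob = repr(sorted(tree.items())).encode()
    oid = hashlib.sha1(blob).hexdigest()
    objects[oid] = dict(tree)  # a full copy of the tree, not a diff
    return oid


v1 = commit({"README": "hello"})
v2 = commit({"README": "hello", "main.c": "int main(){}"})

# Checking out any revision is a plain lookup; there is no patch
# replay, which is what makes branching and merging cheap in git.
assert objects[v1] == {"README": "hello"}
assert objects[v2]["main.c"] == "int main(){}"
```

The point of the sketch is only the storage shape: identical content hashes to the same id, and any revision can be recovered directly, without walking a chain of diffs.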
So, there it is: we have become dependent on the Internet in much the same way as we have become dependent on electricity, the telephone, and the internal combustion engine. At the same time, we have become distanced from the technology of the Internet: everyone uses it, but few can actually make it work. Not everyone needs to, but it is still a good idea to understand the principles on which it is based–the fundamentals of programming and design. Not only does learning to program enable one to understand how the Internet works at an internal level, but the process teaches one to partition tasks, organize procedures, and recognize relationships in data–skills essential for many aspects of life in general.
Like on power-outage nights, we retire early, rising well before the late winter dawn to find—oh, look, a new episode of “Simon’s Cat” on Facebook. Yes, the Internet is back.