Road Show – Traveling With Computers

The Unix curmudgeon and the nice person are on a road show again, traveling back to Montana to take care of business: checking on real estate, visiting clients, and taking care of the off-the-grid cabin. As usual, we make space in the car for the computer, as we have for more than 20 years. In the beginning, it was a business necessity, but now having a computer or smartphone is almost essential for the traveler.

My first traveling computer was a TI CC-40 Compact Computer, a BASIC-programmable machine with a one-line LCD display, not much bigger than a large calculator, that I bought in an after-Christmas closeout in January 1985.

Our first true “portable” computer was a luggable 8086 Sperry Portable, which had a 5-inch CRT and weighed about 40 pounds. It spent a lot of its life with us as an RS-232 serial terminal, but worked fine with a modem, too. It spent so much time in the project room in grad school that the University slapped a property sticker on it, because I couldn’t lug it back and forth all the time. For some reason, we never got around to giving it a name. It ran MS-DOS 4 and had only floppy drives. I bought it used–it had belonged to a computer training company, which stripped the hard drives out before liquidating the machines.

The second portable was duncan, an NCR pen-top computer that ran Windows for Pen Computing 1.0. It was a 386 machine with a 20MB hard drive. I packed a full-sized keyboard and a Telebit QBlazer modem on road trips with that one. Airport security didn’t like it very much, because it was odd. It earned a hostname because I ran UUPC, a shareware port of UUCP to MS-DOS, so it sent and received email over the modem, exchanging with our Coherent (Unix ‘clone’) machine at home.

Our first true laptop computer was darek, a CTX700 with a 200MHz Pentium MMX, 40MB of RAM (a major upgrade from 8MB), and a 2GB hard drive. It came with Windows 95, but spent most of its life running Red Hat Linux 5 or SuSE 6, so it was our first Unix machine on the road. It went everywhere, and I even used it in flight a few times, but I had to put the screen on my lap and touch-type with the keyboard against my chest. Eventually it became too small to run current distributions, so it finally had to be retired. It had an internal 56K modem and 10BaseT Ethernet, with interchangeable CD, floppy, and battery modules: you could run it with the CD, with the floppy, with both on AC, or with two batteries for extended runtime and no peripherals.

With no budget for a new laptop, I built a Mini-ITX-based portable, named “nikita,” with a CD drive and an 80GB hard drive, using a 15-inch LCD monitor and a laptop-sized USB keyboard, but the VIA C3 chipset’s DMA was unstable, which meant it couldn’t run many things at the same time without crashing. The machine got FreeBSD installed on it, gained a second Ethernet card, and became the router for our home network, a job it handled quite well. After five years of faithful service, it did not survive the move to Washington. Sometimes hard drives seize up if allowed to get cold.

Our current road machine is an HP Compaq Presario 714NR, which came with Windows Vista and now runs Ubuntu 9.10, having been upgraded every six months since Ubuntu 7.10. It has had a bit of a memory transfusion, too, up to 2GB and begging for more. It’s got 100BaseT Ethernet, 802.11g wireless, an 80GB hard drive, and a DVD-R/W drive. The big issue with this machine is the Broadcom wireless, which has been problematic for Linux but finally seems to be under control. Having a dual-core CPU helps a lot, but more RAM is needed. The laptop has become a primary machine, at least until the recession recedes, but it is essential to have everything available on the road.

Travel in the 21st century means staying at motels that have free wi-fi, seeking out coffee shops with wi-fi, and war-driving in small towns while on the go or staying at places without network connections. I’ve been writing this in motels and by grabbing wireless where I can. It has become inconceivable to travel any distance without access to the Internet and one’s files. The cloud is becoming a repository for a lot of data that we used to have to sync between machines before going off on a trip. As much as I use the laptop as a primary computer, I still need more power in my office to do “real” work, i.e., clustering, testing with alternate OSes, etc.

One issue with travel is keeping track of your things. Like our tandem bicycle, I never let the laptop out of my sight. There’s a lot of data in there, and losing your computing power on the road is a major inconvenience. I do a backup before heading out, but that doesn’t help on the road. A better solution is to carry a portable backup disk and keep it separate from the laptop. Our situation is a bit peculiar, too, as we can’t readily replace a Linux laptop if it fails or goes on walkabout without permission. Keeping a distro CD handy (again, in another bag) is an option, but inconvenient.

The Internet is essential for business and situational intelligence. The ability to look at real-time weather conditions on the mountain pass ahead is invaluable, as is access to up-to-date maps and geographic databases. I searched in vain for a place that wasn’t listed in my 3-year-old GPS database (too cheap to upgrade–bad mistake), but readily found it once I was able to “jack in” to the Internet on the laptop.

It’s been a strange 25 years, going from what was essentially a programmable calculator to help with fuel and schedule management to running a business on the road out of a briefcase stuffed with what would have been a large mainframe in 1985.

Windows ‘Hives’–Beware of Stings

Picture this scenario:   a typical family on a Friday evening, Dad relaxing before tackling those term papers for college classes, oldest son on the computer, Mom finishing up in the kitchen, the toddlers fussily wearing down into that cranky, tired but not yet bedtime limbo.  Then…

“Mom, there’s something wrong with the computer.” Mom picks up the youngest and looks over Junior’s shoulder. There are browser windows all over the screen. Nothing is happening. Junior clicks on the corner of one. Nothing. It’s frozen.

“Just shut it off,” she says.  He clicks on the Start icon.  Nothing.  So, she reaches down and unplugs the computer.  She plugs it back in, and it boots up. To a blank desktop.  The on-line tickets she purchased a few days ago, Dad’s college term papers, all the familiar icons, gone…  This is not good.

She calls her aunt, who is always on the computer. She’ll know what to do. “Ah, that sounds bad,” says her aunt. “Maybe you should call your grandfather. He’s a real computer expert.” Now, you don’t hear that very often: usually it’s the grandkids helping grandma with her email. But, she does, anyway. Call her grandfather, that is.

The phone rings, and I pick up.  It takes a bit to connect the dots, because she’s in a panic, and I have to switch from my business voice to my “papa” voice.  The kids and grandkids hardly ever call–mostly we email and exchange pictures on Facebook, that sort of thing.  After a few niceties, she relates this tale of woe.  The “user-friendly” computer has turned nasty.

Like a lot of Unix users and system administrators, I dread these calls. Oh, don’t get me wrong, I love hearing from kids and grandkids, but I don’t do Windows. Our relatives know we speak in binary to our computers, and expect us to just gesture hypnotically at the recalcitrant machine until it suddenly behaves. Clarke’s Law at work, I suppose–“Any technology sufficiently different from your own is indistinguishable from magic.” And computers are just, well, computers, aren’t they? Windows is the computer as far as they know.

After I get her calmed down a bit, I explain I really don’t do Windows, but I’ll think about it, meanwhile remotely booting up a copy of Windows XP in a virtual machine on a server somewhere else in Chaos Central, so I can remember what Windows looks like and where things are.  The usual and customary assumption when someone’s computer “goes south” is that it has picked up a virus or has gotten so loaded with spyware it can’t function anymore.  But this doesn’t sound like the issue.

The computer seems to be running normally. I think about having her boot up in Safe Mode, but the F8 key is marked “Terra Incognita” on her mental map of the keyboard, and you have to be fast. She relates an error message that flashes on the screen, to the effect that something can’t be loaded. Aha, I think, this sounds more like it. I ask her to open the “My Computer” icon, open “Hard Drive C:”, and burrow down into Documents and Settings to look through the Desktop folder under each user. Like a lot of Windows families, they have never grasped the concept of separate user spaces: it’s just “the computer,” everyone uses it just the way it is, and it defaults to the “Owner” account.
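
For the record, the hunt she did by mouse amounts to something like this little sketch, assuming the Windows XP default profile layout (the path below is an assumption, not gospel):

    from pathlib import Path

    # XP-era default location of the user profiles.
    profiles = Path(r"C:\Documents and Settings")

    # Peek into each account's Desktop folder for the "lost" files.
    for account in profiles.iterdir():
        desktop = account / "Desktop"
        if desktop.is_dir():
            print(f"--- {account.name} ---")
            for item in sorted(desktop.iterdir()):
                print("   ", item.name)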

The panic strain in her voice disappears and turns to joy–she’s found the files. They’re all there. I’m not sure I can explain more over the phone, so we talk a bit about how her husband’s schooling is going–he’s got one more semester to finish his degree at long last and fulfill his dream of becoming a teacher. The great-grandkids are growing, too. The oldest girl is going to be three. The baby is whining for attention and obviously walking; I can hear her in the background. We’ll be down to visit before they go on the vacation she thought she lost the tickets for…

Later, I send her the “official” solution from the Microsoft Knowledge Base, so she can clean up the mess if she can wade through the arcane kludge. Otherwise, it may just have to wait a couple of months until we visit, though that brings up the other pet peeve of Unix admins everywhere: we didn’t come on vacation to visit your Windows computer, we came to see you. And, we have our own Linux computer in the car, thanks.

And, the winning answer, from Microsoft?  The gist of it is, when this happens: create a new user account and copy all the files from the old account into it.

What?  You’re saying it actually can’t be fixed?

Yup, that’s right.  You have to start over.  But your files are still there…
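
In practice, that fix boils down to copying everything except the broken hive file itself into the fresh profile. A minimal sketch of the idea in Python, assuming XP-era profile paths and hypothetical account names:

    import shutil
    from pathlib import Path

    profiles = Path(r"C:\Documents and Settings")   # XP-era default, an assumption
    old_profile = profiles / "Owner"                # the account with the broken hive
    new_profile = profiles / "Owner.NEW"            # the freshly created account

    # Copy documents, desktop items, favorites, etc., but leave the damaged
    # registry hive (NTUSER.DAT and its log) behind.
    shutil.copytree(
        old_profile,
        new_profile,
        ignore=shutil.ignore_patterns("NTUSER.DAT*", "ntuser*"),
        dirs_exist_ok=True,
    )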

So, how does this happen, that the biggest computer software company on the planet has a feature built into the very fabric of their system, one that controls the user’s view of the system, that every so often just crashes and can’t be fixed?

Well, boys and girls, once upon a time, the designers of what came to be known as Windows New Technology, or Windows NT, desiring to enter the world of corporate computing and take on the “big iron” contenders like IBM, DEC, and all the Unix vendors, built a system configuration structure in which to keep all the important information about the installed software and the current state of the system. The structure was a convoluted, distributed set of databases. Although its organization was tree-like, they didn’t want it to be confused with the also-tree-like file system, so they invented a new term. They named this structure The Registry (with capitals so we would know it was important), and called it a “hive.” Apparently the diagrams the designers used to describe it to the programmers looked like beehives, so the name stuck.

It is probably appropriate, then, that when you mess with the hive, you get stung. Now, most Windows users are blissfully unaware of the Registry and its hive-like structure, but it is there. It turns out that one of the components of this hive is a database for each user account, a file called NTUSER.DAT (Windows has never gotten over its humble origins in MS-DOS, and still uses all-caps, eight-character-plus-three-character-extension names for really important files) and its associated log file. It’s right there in your home folder when you log on, but you don’t see it, because it is “hidden” from the directory view. Besides, it’s a system file, so you can’t mess with it anyway, at least not on purpose.

What users also don’t know is that the Registry, of which their very own personal NTUSER.DAT file is part, has a finite size, defaulted to some fairly low value deemed sufficient until the next upgrade of Windows or the next computer purchase, both of which happen on average every two or three years, in the minds of the marketeers, anyway. There are a couple of things wrong with that. Most home users keep their computers, and the original software, until the machine dies a natural death, anywhere from five to eight years out; and then there are those predictable delays when it takes Microsoft five years to come out with an unusable replacement that nobody is willing to buy a new computer to get.

So, over that long period of time, a time bomb is slowly ticking inside the machine. As you use your machine, that NTUSER.DAT file grows. And grows. When you drop things on your desktop, it gets bigger. When you move them somewhere else, or delete them, it gets bigger. There is no way to shrink it. Now, if you really, really know what you are doing, you can give the Registry more space, but your computer’s performance will suffer, so that’s not a satisfactory solution either (one of the signs that your NTUSER.DAT file is getting too big is that your computer is getting slower and slower). Eventually, the sum total of the Registry files reaches the magic limit, and you can’t write any more data into it. Usually, this happens in the middle of some transaction, which just stops, leaving the file in an ambiguous state, which means, in plain English, it is broken. When you log on, Windows complains, and then your desktop is blank. Game over.
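
If you are curious how close your own hive is to that cliff, the file sizes tell the story; the “hidden” attribute doesn’t hide anything from a script. A rough sketch, again assuming the XP-era profile location:

    from pathlib import Path

    profiles = Path(r"C:\Documents and Settings")   # XP-era default location

    # Report the per-user hive size; a multi-megabyte NTUSER.DAT on an old,
    # heavily used account is a warning sign.
    for hive in profiles.glob("*/NTUSER.DAT"):
        size_mb = hive.stat().st_size / (1024 * 1024)
        print(f"{hive.parent.name:20s} {size_mb:6.1f} MB")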

In the Windows game, you don’t get a new life; you have to get a new character, so to speak. This is just so wrong on so many levels I don’t know where to start. The Windows configuration is so complex that even highly experienced, certified Windows administrators and engineers have trouble dealing with it. Some parts of The Registry can be edited, with special tools, but there are no checks and balances, so it is really easy to turn your computer into an expensive doorstop with one errant keystroke. Oh, you do have one chance to recover, by using Safe Mode (which, as we explained earlier, is difficult or incomprehensible for the average user), but don’t count on it. Windows administrators have a secret mantra they chant when users are not listening: “Reboot often, and, when in doubt, reinstall.” Well, most users don’t back up their files regularly, so reinstalling is just like getting a new computer, minus all your music, photos, games, term papers due next week, and financial records.

So, when we crusty old Unix curmudgeons say, “I don’t do Windows,” we mean it. We know it wasn’t your fault that your computer came with this abomination installed on it, so we sometimes take pity on you. But, we believe that Windows is broken “out of the box,” and the only way to fix the computer is to replace Windows with something else. If we get our hands on your broken Windows installation, don’t be surprised if it has Linux on it instead when you get it back.

Scrivener’s Lament–Writing for Robots

Unix is a great system, designed to make writing programs fast and easy.  So, being a basically lazy person, I like to write programs for Unix.  A few years back, I realized that the best way to get most people to use Unix was to write programs they could operate from the web browsers on the default operating system that came with their computers, which, as we all know, is not Unix, and probably won’t be for some time to come.  Thus was born a concept that we know today as “cloud computing.”  Too bad I didn’t promote the idea outside my customer base.  I might be retired today, out riding my bike instead of sitting in the basement grinding web grist on a sunny day.

But, to get to the point (which will be short, since there is plenty of information on the web these days about writing for it), this paradigm shift led me in strange directions. In addition to being a system administrator and software engineer, I became an accidental web designer. Abandoning my knowledge of curses escape sequences (for text terminals) and even my Perl/Tk skills (a toolkit library for generating graphical interfaces), I embraced HTML and Cascading Style Sheets, but kept on coding in Perl, PHP, and Ruby to create interactive web applications.

All was well, and I was happily coding solutions to problems and hanging web-page interfaces on them, until I got a comment from one of my clients, to the effect of, “We’re not getting enough hits on our site. How do we attract more traffic?” OK, I’d been building web pages since the mid-1990s and knew a bit about meta tags and keywords, and how to keep the search engines out of your back-end data with the robots.txt file, but, spending most of my time building application front-ends for a pre-selected audience, I hadn’t paid much attention to marketing. That’s the job of the client, I thought. “It’s all about the content,” I said. “I can write great code and know enough about human-computer interface principles to put together a decent page and site layout, but you, Ms. Client, know your business best. Write content that says what your customers want to hear when they shop for your product.” Well, it wasn’t all that simple.
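
As an aside, here is what that robots.txt gatekeeping looks like from a well-behaved crawler’s side, sketched with Python’s standard urllib.robotparser; the site and paths are hypothetical:

    from urllib import robotparser

    # A well-behaved robot fetches robots.txt first and obeys it.
    rp = robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")   # hypothetical site
    rp.read()

    # Paths listed under "Disallow:" come back False and never get crawled.
    print(rp.can_fetch("*", "http://www.example.com/cgi-bin/orders.cgi"))
    print(rp.can_fetch("*", "http://www.example.com/index.html"))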

By now, I started seeing spam show up in my mailbox, addressed to ‘webmaster,’ claiming my sites weren’t #1 and that what I needed was SEO. The industry newsletters and blogs I subscribe to were talking about the same thing: Search Engine Optimization. This is what I need, but I’m not going to hire it out; I’m going to do it myself. First, I clean up my page design and code to spin page titles and keywords, but that’s only half the issue. We need some press. We need links; we need the search engines to really notice us, out of all the competing web sites. There’s more to SEO than structure, and the content issue gets complicated.

The web has evolved into a major marketing channel since I started in this business. The admen figured out what gets the attention of the search engine robots, and they spammed the robots, by hook or by crook: the first by stuffing the meta tags with keywords, the second by flooding the pages with invisible keywords (text the color of the background). Suddenly, searches for new cars, baby blankets, advice on getting rid of moles in the lawn, and whatever else all came up with lists of top porn sites. This wouldn’t do, so the robots started ignoring sites with lots of keywords, as a matter of principle. Keywords still work, but in a minimal fashion. Robots got smart enough to look at the fonts and colors, too, and to consider only visible text. The folks who desperately want their site in your face at all costs started filling page footers with irrelevant keywords in tiny print, and the chase was on.

In any ecosystem, organisms evolve or die.  The robots evolved, becoming critical readers of the web pages, to a point, looking, like Goldilocks at the Three Bears’ house, for things that weren’t too hot, too cold, too hard, or too soft, but “just right.”  All this effort is with the goal of separating the good sites, that have something useful to say about the keywords, from the bad sites, that want to divert your attention to them instead of what you were really looking for.  The kind of writing finesse it takes to pass the robots’ critique is just way outside the scope of what most clients can or want to do.  So, I became an SEO copywriter, at least part time.

An SEO copywriter is essentially a person who writes for robots.  Now, the robots are supposedly trying to judge page content by looking for qualities that human readers would find useful or interesting.  But, robots don’t buy cars, they don’t shop for shoes or diet pills, and they don’t get embarrassing skin conditions, so they pay more attention to the hot/cold, hard/soft aspects and not much at all to the color, pattern, and overall aesthetics.  It is said about business reports that they must be accurate, concise (or proper level of detail), and pretty.  Robots get the concise part fairly well, but they don’t have the life experience to judge accuracy nor appearance.  The successful SEO copywriter isn’t just a person who knows how to please the robots, so they will invite real people to view the site, but a good enough writer so the information will be useful and attractive to the viewer.

The accurate part takes knowledge of the subject matter, or at least the ability to research it. The pretty part is even more difficult: the writing must be grammatically correct, involve the reader, and flow smoothly for easy reading. That last part becomes a huge task, since the goal is to sprinkle some fairly arbitrary keyword phrases in at just the right density to satisfy the robots that this is indeed the subject matter and not a diversion. When I’m working directly with a client, I try to dissuade them from using awkward keyword phrases, but sometimes the one they want is what people searching for that particular product or service invariably use, so it’s a challenge to create something that both people and robots will like to read.
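
Keyword density is only one of many signals the robots weigh, but it illustrates the balancing act. A toy sketch, in which the copy, the phrase, and the very idea of a “right” density are all made up for illustration:

    import re

    def keyword_density(text: str, phrase: str) -> float:
        """Fraction of the copy's words taken up by a keyword phrase."""
        words = re.findall(r"[a-z']+", text.lower())
        phrase_words = phrase.lower().split()
        hits = sum(
            words[i:i + len(phrase_words)] == phrase_words
            for i in range(len(words) - len(phrase_words) + 1)
        )
        return hits * len(phrase_words) / max(len(words), 1)

    copy = ("Our hand-made cedar birdhouses are built to last. "
            "Each cedar birdhouse is finished with non-toxic oil.")
    print(f"{keyword_density(copy, 'cedar birdhouse'):.1%}")
    # Too low and the robots shrug; too high and they smell spam.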

Treading the narrow path between what robots consider good copy and what humans will willingly read is a tedious art. There are a lot of SEO shops out there, churning out volumes of what is essentially ad copy that robots will buy. The sheer volume and low pay scale guarantee that the least professional examples are senseless drivel, and that even the best of them are a quirky sort of formula read. The SEO article is the dime novel of the 21st century, churned out in weekly installments for the robot market. When I tried my hand (unsuccessfully, I might add) at submissions for the pulp sci-fi fanzines in the 1950s, they paid a half-penny to one cent per word. Now, over 50 years and several hundred percent of inflation later, writing SEO copy for the web pays a bit less than a half-cent a word. Such is progress. But, it’s an outlet, and it’s writing practice, and practice makes perfect. The great American novel is just waiting to be written. The question is, will it be on the robot best-seller list?

The Elephant in the Room

Last week, a citizen revolt sprang up in “our fair city,” a small town on the backwaters of Puget Sound, nestled against the forested foothills of the Olympic Mountains. Our town’s existence is based on wood products, an industry vital to the core of our civilization, but one with consequences. Having endured one and a quarter centuries of the realities of wood production, the local population of one of the least healthy counties in Washington State finally has cried, “Enough.”

The controversy arose over a proposed biomass-fueled powerplant to be constructed on the outskirts of town, using “Bubbling Fluidized Bed” (BFB) technology. This sounded neat and green and high-tech–at first. Scientists concerned with health issues have stipulated that the current air quality regulations, established in the early days of the 40-year-old Environmental Protection Agency, are not adequate to address health risks from microbe-sized particulates (<2.5 µm). This range of particulate sizes is produced in a higher percentage by volume from BFBs than from the technologies in use when the regulations were written. And, there are other issues of environmental impact on the community, in terms of noise and traffic, as well as dubious economic value due to tax incentives, a relatively small permanent staff, and out-of-state sale of the “product.”

Citizens have a right to question the impacts of the introduction into their community of a potentially-harmful technology that will have an overall negative effect on the community. Though not required by law, because of the size and scope of the project, it is imperative to provide some level of independent impact study before such a project is approved. Throughout the history of the industrial age, communities have not seen a lasting benefit from industrial operations: across the planet, industries have built facilities, an activity that temporarily burdens the locale with a large workforce while providing few permanent jobs. The plant operations produce large amounts of waste, which contaminate the local environment for decades after the profits have been taken and the plants close, further burdening the population with the cost of cleanup. And, ineffective or poorly-enforced regulations permit the construction and operation of profit-making ventures that damage the health of the community and impose financial burdens on the infrastructure far in excess of any tax revenue realized, while creating few permanent jobs for the current residents.

It was no wonder, then, that hundreds of citizens turned out to raise these issues at a recent joint workshop held by the city, county, and public utility district commissions. But, the commissions sidestepped the issues neatly: the proposed plant was nowhere mentioned in the meeting agenda, and the workshop chairperson announced that it would not be discussed. Nevertheless, the BFB powerplant was obviously the elephant in the room. Each and every item on the agenda, save one, addressed issues that bear directly on the infrastructure necessary to implement the project: expansion of water and sewer services to the industrial district (euphemistically referred to as the “Urban Growth Area”) and traffic modifications to the roadways that provide a route for the supply trucks to the proposed plant site. Although it is clear that these civic improvements are needed to sustain the economic viability of the area, promoting a facility of dubious benefit and possible harm to the community without community input is not a responsible approach: the county and city commissions need to consider their constituency’s concerns.

As a closing shot, briefly acknowledging the elephant, one of the commissioners insulted the assembled audience by comparing a natural-gas/wood-pellet boiler at a state facility in the county to the proposed BFB wood-waste boiler. That’s like comparing the impact of your hybrid sedan with that of a logging truck. If our elected officials don’t themselves know the difference, it is time to educate them, or replace them, before more damage is done.

Citizens are increasingly aware of the long-term effects of our industrial society: there have been too many spectacular examples in the last half-century to welcome any industry with open arms just because it promises a few jobs: Love Canal, the Exxon Valdez, the currently developing crisis at the Deepwater Horizon site; lesser-known disasters like the asbestos tragedy in Libby, Montana; and the chronic, on-going tragedy of health impacts and mine explosions in the coal industry. Concerned citizens cannot be dismissed as “certified kooks” because they have unanswered questions about what they perceive as real threats. In cases where citizens were successful in obtaining a fair, unbiased environmental impact assessment, such as the recently-completed Biosafety Level 4 facility at the National Institutes of Health’s Rocky Mountain Laboratories, in the middle of a small town in Montana, the outcome was positive. The facility was not only ultimately built, after a year’s delay, but modifications were made to the entire 30-acre research campus to reduce noise and emissions and to improve traffic flow to the facility. Also, impacts on the community infrastructure were reduced by augmenting the staff with counterparts to community police, fire, and medical support. The community opposition was defused through an extensive outreach and education effort to answer legitimate questions–and, in turn, incorporate those issues into planning and design–as well as to dispel unfounded rumors about the dangers to the public and the ultimate purpose of the facility.

Background

To completely understand the wood-waste powerplant controversy, one must consider the ecosystem, and how this proposed facility fits into the local ecology. Wood is a complex living thing. In the natural cycle of the forest, it grows over decades and centuries, absorbing carbon from the soil and atmosphere, and returning oxygen to sustain the animal life around it. When it dies, it decomposes over many decades and centuries, returning most of that carbon to the soil, while the nitrogen- and sulfur-based compounds that make up the functional cellular structure of all living things become food for the microbes, insects, and new trees around it. Occasionally, when a forest gets overcrowded, or too many trees in one location are killed by disease, the forest ignites in summer lightning storms, dispersing its chemical legacy far and wide, so the forest heals and is renewed.

But, when man uses the forest as a source of fiber for building materials, this cycle is disrupted. Instead of a 200 to 500 year cycle of growth and decay, the growing cycle is limited to the lifespan of the foresters, 60-70 years. The decay cycle is shortened to days or minutes. This creates a carbon imbalance between the atmosphere and biomass. It also releases the nitrogen and sulfur compounds all at once, in toxic quantities. A fractional percentage of the original tree comes out of this process in the form of fiber: lumber or paper. The effects of this accelerated embalming process have been devastating to the local ecology.

Within a few decades of the arrival of the industrial civilization to this region, the rich shellfish beds in the sheltered bays of the lower Sound were gone, over-harvested and then poisoned by the effluent of the tree-embalming process. The food source of local peoples for thousands of years was destroyed in a single generation, and some especially-prized species driven to extinction. Over the succeeding century, some of the more damaging practices have been eliminated, either by attrition as the opportunists exhaust the resource, or by regulation. Unfortunately, regulation has been largely designed to benefit commerce, by limiting short-term risk while maximizing short-term profits.

Concerned citizens in all parts of this country are starting to look at long-term solutions that benefit the ecology, of which they are also a part. These solutions must include policies and practices which most closely follow the natural life cycles of the planet while still permitting commercial exploitation of resources. Those who lived here before us knew how to do this: the indigenous peoples used the forest and the waterways with no more impact than the occasional storm, flood, or forest fire. In the drier inland forests of Idaho and Montana, when the indigenous peoples were forced to abandon their stewardship of the land, what had been taken by the settlers to be “natural” forest quickly reverted to overgrown conditions, resulting, within a generation, in the most devastating forest fire to date, the 1910 Bitterroot fire, which destroyed towns and affected air quality and crop production from Idaho to the Dakotas.

The regulatory response to the 1910 fire was to hand the young U.S. Forest Service a mandate whose goal was not to preserve the ecology, but to maximize forest production by suppressing fires whenever possible, not only disrupting the natural cycle of the forest, but promoting the continuation of unsound harvesting practices. The failure of this system has been well documented, and is evident from the reforms of the past generation that purport to secure clean air and clean water by controlling the effluent of these industrial practices–without changing the fundamental practices themselves.

Technology

We have, today, a proposal to generate electrical power by combustion of raw wood products, i.e., woody parts of trees so recently killed that the needles and leaves are still green. The proposed technology, a Bubbling Fluidized Bed (BFB) boiler, was developed in 1979 in an attempt to reduce toxic emissions from burning coal. However, there is growing evidence that the technology produces a higher percentage of fine particulate, which is not regulated, in the exhaust. Over the past decade, this technology has been adapted for burning sludge and wood waste from paper production, a notoriously polluting industry, with the primary goal of disposing of the waste, and with a token amount of power generation as a secondary by-product. Because coal and wood waste are chemically very different, the operation of these plants has been less than optimal, for a number of reasons:

  • Coal has very low water content, while wood waste has, on average, more than 50% water content, greatly limiting the heat output per unit weight of the fuel, a large percentage of which is consumed in vaporizing the water, so that the overall efficiency is nominally 25%, resulting in correspondingly higher carbon output in the exhaust (see the rough numbers sketched after this list).
  • Because the composition and water content of the fuel is variable, the operational characteristics of the burner must be constantly monitored and regulated to keep the production of nitrogen and sulfur-based compounds within the prescribed limits.
  • The low heat content of the primary fuel often requires the addition of supplemental fuels to maintain the system output at efficient levels. In plants similar to the one proposed, supplemental fuels have ranged from primary wood fiber to discarded tires. The most likely source of supplemental fuel for this plant would be trees that might otherwise be milled into lumber, or processed wood fiber.
  • Most coal-fired plants are supplied by railroads, directly from coal fields, which is a fairly efficient means of transport. Most wood-waste plants are supplied from an adjacent wood-products facility, such as a paper mill, and provide a means of efficiently disposing of a by-product of that facility. The proposed facility depends on fuel brought by trucks from distant logging operations, where it would otherwise be disposed of by open burning or natural decomposition, with a further decrease in net energy production and an increase in the immediate carbon release.
  • BFBs depend on having the fuel supplied in a uniformly small particle size. Coal is fairly uniform in composition and density and easily crushed to size. Wood waste, unless in the form of sludge from other processes, must be ground to size from saplings and branches. The proposal states that burning of bark and leaves will be minimized. If the processing of wood waste into burnable form is done on-site, it appears that there will be a certain percentage of accumulation of secondary waste, with the potential for release of concentrated tannins into the groundwater unless there is a separate disposal plan for bark and needles.
  • Power generation from both coal and wood produces ash as a byproduct. Ash, as with most concentrated chemical by-products, is toxic, containing heavy metals and salts. In modern burners, some of the ash is composed of calcium sulfates generated by the addition of limestone to the burner to reduce sulfur dioxide and sulfuric acid emissions. Because of the low efficiency of wood-waste burners, the quantity of ash produced per megawatt-hour is substantially higher than that from other fuels.
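
The rough numbers behind the first point are easy to sketch. The heating value and latent-heat figures below are typical textbook values, not data from the permit application:

    # Rough energy balance for wet wood fuel (illustrative numbers only).
    DRY_WOOD_HHV_MJ_PER_KG = 20.0        # approximate heating value of dry wood
    WATER_VAPORIZATION_MJ_PER_KG = 2.26  # latent heat of vaporization of water

    def net_heat_per_kg(moisture_fraction: float) -> float:
        """Usable heat per kg of as-delivered fuel, after boiling off its water."""
        dry_mass = 1.0 - moisture_fraction
        water_mass = moisture_fraction
        return (dry_mass * DRY_WOOD_HHV_MJ_PER_KG
                - water_mass * WATER_VAPORIZATION_MJ_PER_KG)

    for m in (0.0, 0.5):
        print(f"{m:.0%} moisture: about {net_heat_per_kg(m):.1f} MJ per kg as burned")
    # At 50% moisture, roughly 0.5*20 - 0.5*2.26 = 8.9 MJ/kg, less than half the
    # dry-wood figure, before the nominal 25% plant efficiency is even applied.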

Conclusions

The technical aspects of the project lead many scientists and engineers to question the wisdom of adapting existing technology to burn wood wastes as a “green” power source. Aside from the appalling inefficiency of power conversion and the difficulty in controlling the emissions in practice, there is the matter of logistics. The environmental impact of the acquisition of the fuel supply and disposal of the solid ash is of concern, and is not addressed in the permit application for the plant itself. A thorough environmental impact statement will address these issues: if the plant is ultimately approved, it will be with the consent of the citizens who will have to live with it and its aftermath, not by token compliance with outmoded and incomplete regulations.

Further Reading

Kraft, D.I., “Bubbling Fluid Bed Boiler Emissions Firing Bark & Sludge,” 1998 TAPPI Engineering Conference, Atlanta.

Johnson, Leonard R., Bruce Lippke, John D. Marshall, and Jeffrey Comnick, “Life-Cycle Impacts of Forest Resource Activities in the Pacific Northwest and Southeast United States,” Wood and Fiber Science, December 2005, Vol. 37, CORRIM Special Issue.

Risi, George F., Marshall E. Bloom, Nancy P. Hoe, Thomas Arminio, Paul Carlson, Tamara Powers, Heinz Feldmann, and Deborah Wilson, “Preparing a Community Hospital to Manage Work-related Exposures to Infectious Agents in BioSafety Level 3 and 4 Laboratories,” Emerging Infectious Diseases, 2010;16(3).

http://nobiomassburning.org/docs/Medical_health_letterFINAL2.pdf

http://www.idahoforests.org/fires.htm

http://www.nwcouncil.org/fw/subbasinplanning/bitterroot/ExecutiveSummary.pdf

Emerson, Rebecca, “Biosafety Regulations: Who’s Watching the Lab? Safety in High Risk Infectious Diseases Research,” Temple University School of Law, 2006, http://www.temple.edu/law/tjstel/2006/fall/v25no2-Emerson.pdf

About the Author

Larye D. Parkins is an information technology professional with 45 years of experience in systems engineering and software engineering. He received a B.A. in Physics from Wartburg College in 1965 and a Master of Software Engineering degree from Seattle University in 1991. He has been involved in the design and development of computing facilities and software in industrial settings for naval combat systems and biosafety facilities. His current projects include support of high-performance computing infrastructure for bioinformatics at the Rocky Mountain Laboratories, NIAID, NIH, and he has contributed design and implementation of genomic data analyses for numerous articles published in peer-reviewed scientific journals such as the Proceedings of the National Academy of Sciences. In the 1970s and 1980s, he was involved with development and modification of combat systems for the U.S. Navy Fast Attack Submarine projects and the Trident Ballistic Missile Submarine data processing subsystem. Over the past three decades, he has hiked and biked extensively on Washington’s Olympic Peninsula and in the Bitterroot Valley of Montana. He currently resides in Shelton, Washington and Hamilton, Montana.