Wednesday, March 25, 2015

Planet of the Space Bats

As my regular readers know, I’ve been talking for quite a while now here about the speculative bubble that’s built up around the fracking phenomenon, and the catastrophic bust that’s guaranteed to follow so vast and delusional a boom. Over the last six months or so, I’ve noted the arrival of one warning sign after another of the impending crash. As the saying has it, though, it’s not over ‘til the fat lady sings, so I’ve been listening for the first notes of the metaphorical aria that, in the best Wagnerian style, will rise above the orchestral score as the fracking industry’s surrogate Valhalla finally bursts into flames and goes crashing down into the Rhine.
 
I think I just heard those first high notes, though, in an improbable place: the email inbox of the Ancient Order of Druids in America (AODA), the Druid order I head.

I have no idea how many of my readers know the first thing about my unpaid day job as chief executive—the official title is Grand Archdruid—of one of the two dozen or so Druid orders in the western world. Most of what goes into that job, and the admittedly eccentric minority religious tradition behind it, has no relevance to the present subject. Still, I think most people know that Druids revere the natural world, and take ecology seriously even when that requires scrapping some of the absurd extravagances that pass for a normal lifestyle these days. Thus a Druid order is arguably the last place that would come to mind if you wanted to sell stock in a fracking company.

Nonetheless, that’s what happened. The bemused AODA office staff the other day fielded a solicitation from a stock firm trying to get Druids to invest their assets in the fracking industry.

Does that sound like a desperation move to you, dear reader? It certainly does to me—and there’s good reason to think that it probably sounds that way to the people who are trying to sell shares in fracking firms to one final round of clueless chumps, too. A recent piece in the Wall Street Journal (available outside the paywall here) noted that American banks have suddenly found themselves stuck with tens of millions of dollars’ worth of loans to fracking firms which they hoped to package up and sell to investors—but suddenly nobody’s buying. Bankruptcies and mass layoffs are becoming an everyday occurrence in the fracking industry, and the price of oil continues to lurch down as producers maximize production for the sake of immediate cash flow.

Why, though, isn’t the drop in the price of oil being met by an upsurge in consumption that drives the price back up, as the accepted rules of economics would predict? That’s the cream of the jest. Here in America, and to a lesser extent elsewhere in the industrial world, four decades of enthusiastically bipartisan policies that benefited the rich at everyone else’s expense managed to prove Henry Ford’s famous argument: if you don’t pay your own employees enough that they can afford to buy your products, sooner or later, you’re going to go broke.

By driving down wages and forcing an ever larger fraction of the US population into permanent unemployment and poverty, the movers and shakers of America’s political class have managed to trigger a classic crisis of overproduction, in which goods go begging for buyers because too few people can afford to buy them at any price that will pay for their production. It’s not just oil that’s affected, either: scores of other commodities are plunging in price as the global economy tips over into depression. There’s a specter haunting the industrial world; it’s the ghost of Karl Marx, laughing with mordant glee as the soi-disant masters of the universe, having crushed his misbegotten Soviet stepchildren, go all out to make his prophecy of capitalism’s self-immolation look remarkably prescient.

The soaring price of crude oil in the wake of the 2005 global peak of conventional oil production should have served notice to the industrial world that, to adapt the title of Richard Heinberg’s excellent 2003 summary of the situation, the party was over:  the long era in which energy supplies had increased year over year was giving way to an unwelcome new reality in which decreasing energy supplies and increasing environmental blowback were the defining themes. As my readers doubtless noticed, though, the only people willing to grasp that were out here on the fringes where archdruids lurk. Closer to the mainstream of our collective thinking, most people scrunched shut their eyes, plugged their ears with their fingers, and shouted “La, la, la, I can’t hear you” at the top of their lungs, in a desperate attempt to keep reality from getting a word in edgewise.

For the last five years or so, any attempt to talk about the impending twilight of the age of oil thus ran headfirst into a flurry of pro-fracking propaganda. Fatuous twaddle about America’s inevitable future as the world’s new energy superpower took the place of serious discussions of the predicament into which we’ve backed ourselves—and not for the first time, either. That’s what makes the attempt to get Druids to invest their life savings in fracking so funny, in a bleak sort of way: it’s an attempt to do for the fracking boom what the fracking boom attempted to do for industrial civilization as a whole—to pretend, in the teeth of the facts, that the unsustainable can be sustained for just a little while longer.

A few months back, I decided to celebrate this sort of thinking by way of the grand old Druid custom of satire. The Great Squirrel Case Challenge of 2015 solicited mock proposals for solving the world’s energy problems that were even nuttier than the ones in the mainstream media. That was no small challenge—a detail some of my readers pointed up by forwarding any number of clueless stories from the mainstream media loudly praising energy boondoggles of one kind or another.

I’m delighted to say, though, that the response was even better than I’d hoped for.  The contest fielded more than thirty entries, ranging from the merely very good to the sidesplittingly funny. There were two winners, one chosen by the members of the Green Wizards forum, one chosen by me; in both cases, it was no easy choice, and if I had enough author’s copies of my new book After Progress, I’d probably just up and give prizes to all the entries, they were that good. Still, it’s my honor to announce the winners:

My choice for best squirrel case—drumroll, please—goes to Steve Morgan, for his fine gosh-wow sales prospectus for, ahem, Shares of Hydrocarbons Imported from Titan. The Green Wizards forum choice—drumroll again—goes to Jason Heppenstall for his hilarious parody of a sycophantic media story, King Solomon’s Miners. Please join me in congratulating them. (Steve and Jason, drop me a comment with your mailing addresses, marked not for posting, and I’ll get your prizes on the way.)

Their hard-won triumph probably won’t last long. In the months and years ahead, I expect to see claims even more ludicrous being taken oh-so-seriously by the mainstream media, because the alternative is to face up to just how badly we’ve bungled the opportunities of the last four decades or so and just how rough a road we have ahead of us as a result. What gave the fracking bubble whatever plausibility it ever had, after all, was the way it fed on one of the faith-based credos at the heart of contemporary popular culture: the insistence, as pervasive as it is irrational, that the universe is somehow obligated to hand us abundant new energy sources to replace the ones we’ve already used so profligately. Lacking that blind faith, it would have been obvious to everyone—as it was to those of us in the peak oil community—that the fracking industry was scraping the bottom of the barrel and pretending that this proved the barrel was full.

Read the morning news with eyes freed from the deathgrip of the conventional wisdom and it’s brutally obvious that that’s what happened, and that the decline and fall of our civilization is well under way. Here in the US, a quarter of the country is in the fourth year of record drought, with snowpack on California’s Sierra Nevada mountains at about 9% of normal; the Gulf Stream is slowing to a crawl due to the rapid melting of the Greenland ice sheets; permanent joblessness and grinding poverty have become pervasive in this country; the national infrastructure is coming apart after decades of malign neglect—well, I could go on; if you want to know what life is like in a falling civilization, go look out the window.

In the mainstream media, on the occasions when such things are mentioned at all, they’re treated as disconnected factoids irrelevant to the big picture. Most people haven’t yet grasped that these things are the big picture—that while we’re daydreaming about an assortment of shiny futures that look more or less like the present with more toys, climate change, resource depletion, collapsing infrastructure, economic contraction, and the implosion of political and cultural institutions are creating the future we’re going to inhabit. Too many of us suffer from a weird inability to imagine a future that isn’t simply a continuation of the present, even when such a future stands knocking at our own front doors.

So vast a failure of imagination can’t be overcome by the simple expedient of pointing out the ways that it’s already failed to explain the world in which we live. That said, there are other ways to break the grip of the conventional wisdom, and I’m pleased to say that one of those other ways seems to be making modest but definite headway just now.

Longtime readers here will remember that in 2011, this blog launched a contest for short stories about the kind of future we can actually expect—a future in which no deus ex machina saves industrial civilization from the exhaustion of its resource base, the deterioration of the natural systems that support it, and the normal process of decline and fall. That contest resulted in an anthology, After Oil: SF Stories of a Post-Petroleum Future, which found a surprisingly large audience. On the strength of its success, I ran a second contest in 2014, which resulted in two more volumes—After Oil 2: The Years of Crisis, which is now available, and After Oil 3: The Years of Rebirth, which is in preparation. Demand for the original volume has remained steady, and the second is selling well; after a conversation with the publisher, I’m pleased to announce that we’re going to do it again, with a slight twist.

The basic rules are mostly the same as before:

Stories should be between 2500 and 7500 words in length;
They should be entirely the work of their author or authors, and should not borrow characters or setting from someone else’s work;
They should be in English, with correct spelling, grammar and punctuation;
They should be stories—narratives with a plot and characters—and not simply a guided tour of some corner of the future as the author imagines it;
They should be set in our future, not in an alternate history or on some other planet;
They should be works of realistic fiction or science fiction, not magical or supernatural fantasy—that is, the setting and story should follow the laws of nature as those are presently understood;
They should take place in settings subject to thermodynamic, ecological, and economic limits to growth; and as before,
They must not rely on “alien space bats”—that is, dei ex machina inserted to allow humanity to dodge the consequences of the limits to growth. (Aspiring authors might want to read the whole “Alien Space Bats” post for a more detailed explanation of what I mean here; reading the stories from one or both of the published After Oil volumes might also be a good plan.)

This time, though, I’m adding an additional rule:

Stories submitted for this contest must be set at least one thousand years in the future—that is, after March 25, 3015 in our calendar.

That’s partly a reflection of a common pattern in entries for the two previous contests, and partly something deeper. The common pattern? A great many authors submitted stories that were set during or immediately after the collapse of industrial civilization; there’s certainly room for those, enough so that the entire second volume is basically devoted to them, but tales of surviving decline and fall are only a small fraction of the galaxy of potential stories that would fit within the rules listed above.  I’d like to encourage entrants to consider telling something different, at least this time.

The deeper dimension? That’s a reflection of the blindness of the imagination discussed earlier in this post, the inability of so many people to think of a future that isn’t simply a prolongation of the present. Stories set in the immediate aftermath of our civilization don’t necessarily challenge that, and I think it’s high time to start talking about futures that are genuinely other—neither utopia nor oblivion, but different, radically different, from the linear extrapolations from the present that fill so many people’s imaginations these days, and have an embarrassingly large role even in science fiction.

You have to read SF from more than a few decades back to grasp just how tight the grip of a single linear vision of the future has become on what used to be a much more freewheeling literature of ideas. In book after book, and even more in film after film, technologies that are obviously derived from ours, ideologies that are indistinguishable from ours, political and economic arrangements that could pass for ours, and attitudes and ideas that belong to this or that side of today’s cultural struggles get projected onto the future as though they’re the only imaginable options. This takes place even when there’s very good reason to think that the linear continuation of current trends isn’t an option at all—for example, the endlessly regurgitated, done-to-death trope of interstellar travel.

Let us please be real:  we aren’t going to the stars—not in our lifetimes, not in the lifetime of industrial civilization, not in the lifetime of our species. There are equally good thermodynamic and economic reasons to believe that many of the other standard tropes of contemporary science fiction are just as unreachable—that, for example, limitless energy from gimmicks of the dilithium-crystal variety, artificial intelligences capable of human or superhuman thought, and the like belong to fantasy, not to the kind of science fiction that has any likelihood of becoming science fact. Any of my readers who want to insist that human beings can create anything they can imagine, by the way, are welcome to claim that, just as soon as they provide me with a working perpetual motion machine.

It’s surprisingly common to see people insist that the absence of the particular set of doodads common to today’s science fiction would condemn our descendants to a future of endless boredom. This attitude shows a bizarre stunting of the imagination—not least because stories about interstellar travel normally end up landing the protagonists in a world closely modeled on some past or present corner of the Earth. If our genus lasts as long as the average genus of vertebrate megafauna, we’ve got maybe ten million years ahead of us, or roughly two thousand times as long as all of recorded human history to date: more than enough time for human beings to come up with a dazzling assortment of creative, unexpected, radically different societies, technologies, and ways of facing the universe and themselves.

That’s what I’d like to see in submissions to this year’s Space Bats challenge—yes, it’ll be an annual thing from here on out, as long as the market for such stories remains lively. A thousand years from now, industrial civilization will be as far in the past as the Roman Empire was at the time of the Renaissance, and new human societies will have arisen to pass their own judgment on the relics of our age. Ten thousand years from now, or ten million? Those are also options. Fling yourself into the far future, far enough that today’s crises are matters for the history books, or tales out of ancient myth, or forgotten as completely as the crises and achievements of the Neanderthal people are today, and tell a story about human beings (or, potentially, post-human beings) confronting the challenges of their own time in their own way. Do it with verve and a good readable style, and your story may be one of the ones chosen to appear in the pages of After Oil 4:  The Future’s Distant Shores.

The mechanics are pretty much the same as before. Write your story and post it to the internet—if you don’t have a blog, you can get one for free from Blogspot or Wordpress. Post a link to it in the comments to The Archdruid Report. You can write more than one story, but please let me know which one you want entered in the competition—there will be only one entry accepted per author this time. Stories must be written and posted online, and a link posted to this blog, by August 30, 2015 to be eligible for inclusion in the anthology.

Wednesday, March 18, 2015

The View From Outside

Recently I’ve been reacquainting myself with the stories of Clark Ashton Smith. Though he’s largely forgotten today, Smith was one of the leading lights of Weird Tales magazine during its 1930s golden age, ranking with H.P. Lovecraft and Robert Howard as a craftsman of fantasy fiction. Like Lovecraft, Howard, and most of the other authors in the Weird Tales stable, Smith was an outsider; he spent his life in a small town in rural California; he was roundly ignored by the literary scene of his day, and returned the favor with gusto. With the twilight of the pulps, Smith’s work was consigned to the dustbin of literary history.  It was revived briefly during the fantasy boom of the 1970s, only to sink from sight again when the fantasy genre drowned in a swamp of faux-medieval clichés thereafter.

There’s no shortage of reasons to give Smith another look today, starting with his mastery of image and atmosphere and the wry humor that shaped the best of his mature work. Still, that’s a theme for another time, and possibly another forum. The theme that’s relevant to this blog is woven into one of Smith’s classic stories, The Dark Age. First published in 1938, it’s among the earliest science fiction stories I know of that revolves around an organized attempt to preserve modern science through a future age of barbarism.

The story’s worth reading in its own right, so I won’t hand out spoilers here. Still, I don’t think it will give away anything crucial to mention that one of the mainsprings of the story is the inability of the story’s scientists to find or make common ground with the neo-barbarian hill tribes around them. That aspect of the story has been much on my mind of late. Despite the rockets and rayguns that provide so much of its local color, science fiction is always about the present, which it displays in an unfamiliar light by showing a view from outside, from the distant perspective of an imaginary future.

That’s certainly true of Smith’s tale, which drew much of its force at the time of its composition from the widening chasm between the sciences and the rest of human culture that C.P. Snow discussed two decades later in his famous work “The Two Cultures.” That chasm has opened up a good deal further since Smith’s time, and its impact on the future deserves discussion here, not least because it’s starting to come into sight even through the myopic lenses of today’s popular culture.

I’m thinking here, for example, of a recent blog post by Scott Adams, the creator of the “Dilbert” comic strip. There’s a certain poetic justice in seeing popular culture’s acknowledged expert on organizational failure skewer one of contemporary science’s more embarrassing habits, but there’s more to the spectacle than a Dilbertesque joke. As Adams points out, there’s an extreme mismatch between the way that science works and the way that scientists expect their claims to be received by the general public. Within the community of researchers, the conclusions of the moment are, at least in theory, open to constant challenge—but only from within the scientific community.

The general public is not invited to take part in those challenges. Quite the contrary, it’s supposed to treat the latest authoritative pronouncement as truth pure and simple, even when that contradicts the authoritative pronouncements of six months before. Now of course there are reasons why scientists might not want to field a constant stream of suggestions and challenges from people who don’t have training in relevant disciplines, but the fact remains that expecting people to blindly accept whatever scientists say about nutrition, when scientific opinion on that subject has been whirling around like a weathercock for decades now, is not a strategy with a long shelf life. Sooner or later people start asking why they should take the latest authoritative pronouncement seriously, when so many others landed in the trash can of discarded opinions a few years further on.

There’s another, darker reason why such questions are increasingly common just now. I’m thinking here of the recent revelation that the British scientists tasked by the government with making dietary recommendations have been taking payola of various kinds from the sugar industry.  That’s hardly a new thing these days. Especially but not only in those branches of science concerned with medicine, pharmacology, and nutrition, the prostitution of the scientific process by business interests has become an open scandal. When a scientist gets behind a podium and makes a statement about the safety or efficacy of a drug, a medical treatment, or what have you, the first question asked by an ever-increasing number of people outside the scientific community these days is “Who’s paying him?”

It would be bad enough if that question was being asked because of scurrilous rumors or hostile propaganda. Unfortunately, it’s being asked because there’s nothing particularly unusual about the behavior of the British scientists mentioned above. These days, in any field where science comes into contact with serious money, scientific studies are increasingly just another dimension of marketing. From influential researchers being paid to put their names on dubious studies to give them unearned credibility to the systematic concealment of “outlying” data that doesn’t support the claims made for this or that lucrative product, the corruption of science is an ongoing reality, and one that existing safeguards within the scientific community are not effectively countering.

Scientists have by and large treated the collapse in scientific ethics as an internal matter. That’s a lethal mistake, because the view that matters here is the view from outside. What looks to insiders like a manageable problem that will sort itself out in time, looks from outside the laboratory and the faculty lounge like institutionalized corruption on the part of a self-proclaimed elite whose members cover for each other and are accountable to no one. It doesn’t matter, by the way, how inaccurate that view is in specific cases, how many honest men and women are laboring at lab benches, or how overwhelming the pressure to monetize research that’s brought to bear on scientists by university administrations and corporate sponsors: none of that finds its way into the view from outside, and in the long run, the view from outside is the one that counts.

The corruption of science by self-interest is an old story, and unfortunately it’s most intense in those fields where science impacts the lives of nonscientists most directly:  yes, those would be medicine, pharmacology, and nutrition. I mentioned in an earlier blog post here a friend whose lifelong asthma, which landed her in the hospital repeatedly and nearly killed her twice, was cured at once by removing a common allergen from her diet. Mentioning this to her physician led to the discovery that he’d known about the allergy issue all along, but as he explained, “We prefer to medicate for that.” Understandably so, as a patient who’s cured of an ailment is a good deal less lucrative for the doctor than one who has to keep on receiving regular treatments and prescriptions—but as a result of that interaction among others, the friend in question has lost most of what respect she once had for mainstream medicine, and is now learning herbalism to meet her health care needs.

It’s an increasingly common story these days, and I could add plenty of other accounts here. The point I want to make, though, is that it’s painfully obvious that the physician who preferred to medicate never thought about the view from outside. I have no way of knowing what combination of external pressures and personal failings led him to conceal a less costly cure from my friend, and keep her on expensive and ineffective drugs with a gallery of noxious side effects instead, but from outside the walls of the office, it certainly looked like a callous betrayal of whatever ethics the medical profession might still have left—and again, the view from outside is the one that counts.

It counts because institutional science only has the authority and prestige it possesses today because enough of those outside the scientific community accept its claim to speak the truth about nature. Not that many years ago, all things considered, scientists didn’t have the authority or the prestige, and no law of nature or of society guarantees that they’ll keep either one indefinitely. Every doctor who would rather medicate than cure, every researcher who treats conflicts of interest as just another detail of business as usual, every scientist who insists in angry tones that nobody without a Ph.D. in this or that discipline is entitled to ask why this week’s pronouncement should be taken any more seriously than the one it just disproved—and let’s not even talk about the increasing, and increasingly public, problem of overt scientific fraud in the pharmaceutical field among others—is hastening the day when modern science is taken no more seriously by the general public than, say, academic philosophy is today.

That day may not be all that far away. That’s the message that should be read, and is far too rarely read, in the accelerating emergence of countercultures that reject the authority of science in one field or another. As a recent and thoughtful essay in Slate pointed out, that crisis of authority is what gives credibility to such movements as climate denialism and the “anti-vaxxers” (the growing number of parents who refuse to have their children vaccinated). A good many people these days, when the official voices of the scientific community say this or that, respond by asking “Why should we believe you?”—and too many of them don’t get a straightforward answer that addresses their concerns.

A bit of personal experience from a different field may be relevant here. Back in the late 1980s and early 1990s, when I lived in Seattle, I put a fair amount of time into collecting local folklore concerning ghosts and other paranormal phenomena. I wasn’t doing this out of any particular belief, or for that matter any particular unbelief; I was seeking a sense of the mythic terrain of the Puget Sound region, the landscapes of belief and imagination that emerged from the experiences of people on the land, with an eye toward the career of writing fiction that I then hoped to launch. While I was doing this research, when something paranormal was reported anywhere in the region, I generally got to hear about it fairly quickly, and in the process I got to watch a remarkable sequence of events that repeated itself like a broken record in more cases than I can count.

Whether the phenomenon that was witnessed was an unusual light in the sky, a seven-foot-tall hairy biped in the woods, a visit from a relative who happened to be dead at the time, or what have you, two things followed promptly once the witness went public. The first was the arrival of a self-proclaimed skeptic, usually a member of CSICOP (the Committee for Scientific Investigation of Claims of the Paranormal), who treated the witness with scorn and condescension, made dogmatic claims about what must have happened, and responded to any disagreement with bullying and verbal abuse. The other thing that followed was the arrival of an investigator from one of the local paranormal-research organizations, who was invariably friendly and supportive, listened closely to the account of the witness, and took the incident seriously. I’ll let you guess which of the proposed explanations the witness usually ended up embracing, not to mention which organization he or she often joined.

The same process on a larger and far more dangerous scale is shaping attitudes toward science across a wide and growing sector of American society. Notice that unlike climate denialism, the anti-vaxxer movement isn’t powered by billions of dollars of grant money, but it’s getting increasing traction. The reason is as simple as it is painful: parents are asking physicians and scientists, “How do I know this substance you want to put into my child is safe?”—and the answers they’re getting are not providing them with the reassurance they need.

It’s probably necessary here to point out that I’m no fan of the anti-vaxxer movement. Since epidemic diseases are likely to play a massive role in the future ahead of us, I’ve looked into anti-vaxxer arguments with some care, and they don’t convince me at all. It’s clear from the evidence that vaccines do far more often than not provide protection against dangerous diseases; while some children are harmed by the side effects of vaccination, that’s true of every medical procedure, and the toll from side effects is orders of magnitude smaller than the annual burden of deaths from these same diseases in the pre-vaccination era.

Nor does the anti-vaxxer claim that vaccines cause autism hold water. (I have Asperger’s syndrome, so the subject’s of some personal interest to me.)  The epidemiology of autism spectrum disorders simply doesn’t support that claim; to my educated-layperson’s eyes, at least, it matches that of an autoimmune disease instead, complete with the rapid increase in prevalence in recent years. The hypothesis I’d be investigating now, if I’d gone into biomedical science rather than the history of ideas, is that autism spectrum disorders are sequelae of an autoimmune disease that strikes in infancy or early childhood, and causes damage to any of a variety of regions in the central nervous system—thus the baffling diversity of neurological deficits found in those of us on the autism spectrum.

Whether that’s true or not will have to be left to trained researchers. The point that I want to make here is that I don’t share the beliefs that drive the anti-vaxxer movement. Similarly, I’m sufficiently familiar with the laws of thermodynamics and the chemistry of the atmosphere to know that when the climate denialists insist that dumping billions of tons of carbon dioxide into the atmosphere can’t change its capacity to retain heat, they’re smoking their shorts.  I’ve retained enough of a childhood interest in paleontology, and studied enough of biology and genetics since then, to be able to follow the debates between evolutionary biology and so-called “creation science,” and I’m solidly on Darwin’s side of the bleachers. I could go on; I have my doubts about a few corners of contemporary scientific theory, but then so do plenty of scientists.

That is to say, I don’t agree with the anti-vaxxers, the climate denialists, the creationists, or their equivalents, but I think I understand why they’ve rejected the authority of science, and it’s not because they’re ignorant cretins, much as the proponents and propagandists of science would like to claim that they are. It’s because they’ve seen far too much of the view from outside. Parents who encounter a medical industry that would rather medicate than heal are more likely to listen to anti-vaxxers; Americans who watch climate change activists demand that the rest of the world cut its carbon footprint, while the activists themselves get to keep cozy middle-class lifestyles, are more likely to believe that global warming is a politically motivated hoax; Christians who see atheists using evolution as a stalking horse for their ideology are more likely to turn to creation science—and all three, and others, are not going to listen to scientists who insist that they’re wrong, until and unless the scientists stop and take a good hard look at how they and their proclamations look when viewed from outside.

I’m far from sure that anybody in the scientific community is willing to take that hard look. It’s possible; these days, even committed atheists are starting to notice that whenever Richard Dawkins opens his mouth, twenty people who were considering atheism decide to give God a second chance. The arrogant bullying that used to be standard practice among the self-proclaimed skeptics and “angry atheists” has taken on a sullen and defensive tone recently, as though it’s started to sink in that yelling abuse at people who disagree with you might not be the best way to win their hearts and minds. Still, for that same act of reflection to get any traction in the scientific community, a great many people in that community are going to have to rethink the way they deal with the public, especially when science, technology, and medicine cause harm. That, in turn, is only going to happen if enough of today’s scientists remember the importance of the view from outside.

In the light of the other issues I’ve tried to discuss over the years in this blog, that view has another dimension, and it’s a considerably harsher one. Among the outsiders whose opinion of contemporary science matters most are some that haven’t been born yet: our descendants, who will inhabit a world shaped by science and the technologies that have resulted from scientific research. It’s still popular to insist that their world will be a Star Trek fantasy of limitlessness splashed across the galaxy, but I think most people are starting to realize just how unlikely that future actually is.

Instead, the most likely futures for our descendants are those in which the burdens left behind by today’s science and technology are much more significant than the benefits.  Those most likely futures will be battered by unstable climate and rising oceans due to anthropogenic climate change, stripped of most of the world's topsoil, natural resources, and ecosystems, strewn with the radioactive and chemical trash that our era produced in such abundance and couldn’t be bothered to store safely—and most of today’s advanced technologies will have long since rusted into uselessness, because the cheap abundant energy and other nonrenewable resources that were needed to keep them running all got used up in our time.

People living in such a future aren’t likely to remember that a modest number of scientists signed petitions and wrote position papers protesting some of these things. They’re even less likely to recall the utopian daydreams of perpetual progress and limitless abundance that encouraged so many other people in the scientific community to tell themselves that these things didn’t really matter—and if by chance they do remember those daydreams, their reaction to them won’t be pretty. That science today, like every other human institution in every age, combines high ideals and petty motives in the usual proportions will not matter to them in the least.

Unless something changes sharply very soon, their view from outside may well see modern science—all of it, from the first gray dawn of the scientific revolution straight through to the flamelit midnight when the last laboratory was sacked and burned by a furious mob—as a wicked dabbling in accursed powers that eventually brought down just retribution upon a corrupt and arrogant age. So long as the proponents and propagandists of science ignore the view from outside, and blind themselves to the ways that their own defense of science is feeding the forces that are rising against it, the bleak conclusion of the Clark Ashton Smith story cited at the beginning of this post may yet turn out to be far more prophetic than the comfortable fantasies of perpetual scientific advancement cherished by so many people today.

********

On a less bleak but not wholly unrelated subject, I’m pleased to announce that my forthcoming book After Progress is rolling off the printing press as I write this. There were a few production delays, and so it’ll be next month before orders from the publisher start being shipped; the upside to this is that the book can still be purchased for 20% off the cover price. I’m pretty sure that this book will offend people straight across the spectrum of acceptable opinion in today’s industrial society, so get your copy now, pop some popcorn, and get ready to enjoy the show.

Wednesday, March 11, 2015

The Prosthetic Imagination

Two news stories and an op-ed piece in the media in recent days provide a useful introduction to the theme of this week’s post here on The Archdruid Report. The first news story followed the announcement that the official unemployment rate here in the United States dropped to 5.5% last month. This was immediately hailed by pundits and politicians as proof that the recession we weren’t in is over at last, and the happy days that never went away are finally here again.

This jubilation makes perfect sense so long as you don’t happen to know that the official unemployment rate in the United States doesn’t actually depend on the number of people who are out of work. What it indicates is the percentage of US residents who happen to be receiving unemployment benefits—which, as I think most people know at this point, run out after a certain period. Right now there are a huge number of Americans who exhausted their unemployment benefits a long time ago, can’t find work, and would count as unemployed by any measure except the one used by the US government these days.  As far as officialdom is concerned, they are nonpersons in very nearly an Orwellian sense, their existence erased to preserve a politically expedient fiction of prosperity.

How many of these economic nonpersons are there in the United States today? That figure’s not easy to find amid the billowing statistical smokescreens. Still, it’s worth noting that 92,898,000 Americans of working age are not currently in the work force—that is, more than 37 per cent of the working age population. If you spend time around people who don’t belong to this nation’s privileged classes, you already know that a lot of those people would gladly take jobs if there were jobs to be had, but again, that’s not something that makes it through the murk.
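For those who like to check this sort of arithmetic for themselves, the figure above is easy enough to verify in a few lines of Python. The not-in-labor-force count comes from the text; the working-age population total is my own rough assumption, drawn from memory of the government's published figures for early 2015, and not a number the post itself supplies.

```python
# Back-of-the-envelope check of the labor-force share cited above.
not_in_labor_force = 92_898_000        # figure quoted in the text
working_age_population = 250_000_000   # assumed civilian working-age total

share = not_in_labor_force / working_age_population
print(f"{share:.1%}")  # a bit over 37%, matching the figure in the text
```

The point of the exercise is simply that the 37 per cent figure follows directly from the two totals; no statistical murk is needed to reproduce it.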

We could spend quite a bit of time talking about the galaxy of ways in which economic statistics are finessed and/or fabricated these days, but the points already raised are enough for the present purpose. Let’s move on. The op-ed piece comes from erstwhile environmentalist Stewart Brand, whose long journey from editing CoEvolution Quarterly to channeling Bjorn Lomborg is as perfect a microcosm of the moral collapse of 20th century American environmentalism as you could hope to find. Brand’s latest piece claims that despite all evidence to the contrary—and of course there’s quite a bit of that these days—the environment is doing just fine: the economy has decoupled from resource use in recent decades, at least here in America, and so we can continue to wallow in high-tech consumer goodies without worrying about what we’re doing to the planet.

There’s a savage irony in the fact that in 1975, when his magazine was the go-to place to read about the latest ideas in systems theory and environmental science, Brand could have pointed out the gaping flaw in that argument in a Sausalito minute. Increasing prosperity in the United States has “decoupled” from resource use for two reasons: first, only a narrowing circle of privileged Americans get to see any of the paper prosperity we’re discussing—the standard of living for most people in this country has been contracting steadily for four decades—and second, the majority of consumer goods used in the United States are produced overseas, and so the resource use and environmental devastation involved in manufacturing the goodies we consume so freely takes place somewhere else.

That is to say, what Brand likes to call decoupling is our old friend, the mass production of ecological externalities. Brand can boast about prosperity without environmental cost because the great majority of the costs are being carried by somebody else, somewhere else, and so don’t find their way into his calculations.  The poor American neighborhoods where people struggle to get by without jobs are as absent from his vision of the world as they are from the official statistics; the smokestacks, outflow pipes, toxic-waste dumps, sweatshopped factories, and open-pit mines worked by slave labor that prop up his high-tech lifestyle are overseas, so they don’t show up on US statistics either. As far as Brand is concerned, that means they don’t count.

We could talk more about the process by which a man who first became famous for pressuring NASA into releasing a photo of the whole earth is now insisting that the only view that matters is the one from his living room window, but let’s go on. The other news item is the simplest and, in a bleak sort of way, the funniest of the lot.  According to recent reports, state government officials in Florida are being forbidden from using the phrase “climate change” when discussing the effects of, whisper it, climate change.

This is all the more mordantly funny because Florida is on the front lines of climate change right now.  Even the very modest increases in sea level we’ve seen so far, driven by thermal expansion and the first rounds of Greenland and Antarctic meltwater, are sending seawater rushing out of the storm sewers into the streets of low-lying parts of coastal Florida towns whenever the tide is high and an onshore wind blows hard enough. As climate change accelerates—and despite denialist handwaving, it does seem to be doing that just now—a lot of expensive waterfront property in Florida is going to end up underwater in more than a financial sense.  The state government’s response to this clear and present danger? Prevent state officials from talking about it.

We could look at a range of other examples of this same kind, but these three will do for now. What I want to discuss now is what’s going on here, and what it implies.

Let’s begin with the obvious. In all three of the cases I’ve cited, an uncomfortable reality is being dismissed by manipulating abstractions. An abstraction called “the unemployment rate” has been defined so that the politicians and bureaucrats who cite it don’t have to deal with just how many Americans these days can’t get paid employment; an abstraction called “decoupling” and a range of equally abstract (and cherrypicked) measures of environmental health are being deployed so that Brand and his readers don’t have to confront the soaring ecological costs of computer technology in particular and industrial society in general; an abstraction called “climate change,” finally, is being banned from use by state officials because it does too good a job of connecting certain dots that, for political reasons, Florida politicians don’t want people to connect.

To a very real extent, this sort of thing is pervasive in human interaction, and has been since the hoots and grunts of hominin vocalization first linked up with a few crude generalizations in the dazzled mind of an eccentric australopithecine. Human beings everywhere use abstract categories and the words that denote them as handles by which to grab hold of unruly bundles of experience. We do it far more often, and far more automatically, than most of us ever notice.  It’s only under special circumstances—waking up at night in an unfamiliar room, for example, and finding that the vague somethings around us take a noticeable amount of time to coalesce into ordinary furniture—that the mind’s role in assembling the fragmentary data of sensation into the objects of our experience comes to light.

When you look at a tree, for example, it’s common sense to think that the tree is sitting out there, and your eyes and mind are just passively receiving a picture of it—but then it’s common sense to think that the sun revolves around the earth. In fact, as philosophers and researchers into the psychophysics of sensation both showed a long time ago, what happens is that you get a flurry of fragmentary sense data—green, brown, line, shape, high contrast, low contrast—and your mind constructs a tree out of it, using its own tree-concept (as well as a flurry of related concepts such as “leaf,” “branch,” “bark,” and so on) as a template. You do that with everything you see, and the reason you don’t notice it is that it was the very first thing you learned how to do, as a newborn infant, and you’ve practiced it so often you don’t have to think about it any more.

You do the same thing with every representation of a sensory object. Let’s take visual art for an example.  Back in the 1870s, when the Impressionists first started displaying their paintings, it took many people a real effort to learn how to look at them, and a great many never managed the trick at all. Among those who did, though, it was quite common to hear comments about how this or that painting had taught them to see a landscape, or what have you, in a completely different way. That wasn’t just hyperbole:  the Impressionists had learned how to look at things in a way that brought out features of their subjects that other people in late 19th century Europe and America had never gotten around to noticing, and highlighted those things in their paintings so forcefully that the viewer had to notice them.

The relation between words and the things they denote is thus much more complex, and much more subjective, than most people ever quite get around to realizing. That’s challenging enough when we’re talking about objects of immediate experience, where the concept in the observer’s mind has the job of fitting fragmentary sense data into a pattern that can be verified by other forms of sense data—in the example of the tree, by walking up to it and confirming by touch that the trunk is in fact where the sense of sight said it was. It gets far more difficult when the raw material that’s being assembled by the mind consists of concepts rather than sensory data: when, let’s say, you move away from your neighbor Joe, who can’t find a job and is about to lose his house, start thinking about all the people in town who are in a similar predicament, and end up dealing with abstract concepts such as unemployment, poverty, the distribution of wealth, and so on.

Difficult or not, we all do this, all the time. There’s a common notion that dealing in abstractions is the hallmark of the intellectual, but that puts things almost exactly backwards; it’s the ordinary unreflective person who thinks in abstractions most of the time, while the thinker’s task is to work back from the abstract category to the raw sensory data on which it’s based. That’s what the Impressionists did:  staring at a snowbank as Monet did, until he could see the rainbow play of colors behind the surface impression of featureless white, and then painting the colors into the representation of the snowbank so that the viewer was shaken out of the trance of abstraction (“snow” = “white”) and saw the colors too—first in the painting, and then when looking at actual snow.

Human thinking, and human culture, thus dance constantly between the concrete and the abstract, or to use a slightly different terminology, between immediate experience and a galaxy of forms that reflect experience back in mediated form. It’s a delicate balance: too far into the immediate and experience disintegrates into fragmentary sensation; too far from the immediate and experience vanishes into an echo chamber of abstractions mediating one another. The most successful and enduring creations of human culture have tended to be those that maintain the balance. Representational painting is one of those; another is literature. Read the following passage closely:

“Eastward the Barrow-downs rose, ridge behind ridge into the morning, and vanished out of eyesight into a guess: it was no more than a guess of blue and a remote white glimmer blending with the hem of the sky, but it spoke to them, out of memory and old tales, of the high and distant mountains.”

By the time you finished reading it, you likely had a very clear sense of what Frodo Baggins and his friends were seeing as they looked off to the east from the hilltop behind Tom Bombadil’s house. So did I, as I copied the sentence, and so do most people who read that passage—but no two people see the same image, because the image each of us sees is compounded out of bits of our own remembered experiences. For me, the image that comes to mind has always drawn heavily on the view eastwards from the suburban Seattle neighborhoods where I grew up, across the rumpled landscape to the stark white-topped rampart of the Cascade Mountains. I know for a fact that that wasn’t the view that Tolkien himself had in mind when he penned that sentence; I suspect he was thinking of the view across the West Midlands toward the Welsh mountains, which I’ve never seen; and I wonder what it must be like for someone to read that passage whose concept of ridges and mountains draws on childhood memories of the Urals, the Andes, or Australia’s Great Dividing Range instead.

That’s one of the ways that literature takes the reader through the mediation of words back around to immediate experience. If I ever do have the chance to stand on a hill in the West Midlands and look off toward the Welsh mountains, Tolkien’s words are going to be there with me, pointing me toward certain aspects of the view I might not otherwise have noticed, just as they did in my childhood. It’s the same trick the Impressionists managed with a different medium: stretching the possibilities of experience by representing (literally re-presenting) the immediate in a mediated form.

Now think about what happens when that same process is hijacked, using modern technology, for the purpose of behavioral control.

That’s what advertising does, and more generally what the mass media do. Think about the fast food company that markets its product under the slogan “I’m loving it,” complete with all those images of people sighing with post-orgasmic bliss as they ingest some artificially flavored and colored gobbet of processed pseudofood. Are they loving it? Of course not; they’re hack actors being paid to go through the motions of loving it, so that the imagery can be drummed into your brain and drown out your own recollection of the experience of not loving it. The goal of the operation is to keep you away from immediate experience, so that a deliberately distorted mediation can be put in its place.

You can do that with literature and painting, by the way. You can do it with any form of mediation, but it’s a great deal more effective with modern visual media, because those latter short-circuit the journey back to immediate experience. You see the person leaning back with the sigh of bliss after he takes a bite of pasty bland bun and tasteless gray mystery-meat patty, and you see it over and over and over again. If you’re like most Americans, and spend four or five hours a day staring blankly at little colored images on a glass screen, a very large fraction of your total experience of the world consists of this sort of thing: distorted imitations of immediate experience, intended to get you to think about the world in ways that immediate experience won’t justify.

The externalization of the human mind and imagination via the modern mass media has no shortage of problematic features, but the one I want to talk about here is the way that it feeds into the behavior discussed at the beginning of this post: the habit, pervasive in modern industrial societies just now, of responding to serious crises by manipulating abstractions to make them invisible. That kind of thing is commonplace in civilizations on their way out history’s exit door, for reasons I’ve discussed in an earlier sequence of posts here, but modern visual media make it an even greater problem in the present instance. These latter function as a prosthetic for the imagination, a device for replacing the normal image-making functions of the human mind with electromechanical equivalents. What’s more, you don’t control the prosthetic imagination; governments and corporations control it, and use it to shape your thoughts and behavior in ways that aren’t necessarily in your best interests.

The impact of the prosthetic imagination on the crisis of our time is almost impossible to overstate. I wonder, for example, how many of my readers have noticed just how pervasive references to science fiction movies and TV shows have become in discussions of the future of technology. My favorite example just now is the replicator, a convenient gimmick from the Star Trek universe: you walk up to it and order something, and the replicator pops it into being out of nothing.

It’s hard to think of a better metaphor for the way that people in the privileged classes of today’s industrial societies like to think of the consumer economy. It’s also hard to think of anything that’s further removed from the realities of the consumer economy. The replicator is the ultimate wet dream of externalization: it has no supply chains, no factories, no smokestacks, no toxic wastes, just whatever product you want any time you happen to want it. That’s exactly the kind of thinking that lies behind Stewart Brand’s fantasy of “decoupling”—and it’s probably no accident that more often than not, when I’ve had conversations with people who think that 3-D printers are the solution to everything, they bring Star Trek replicators into the discussion.

3-D printers are not replicators. Their supply chains and manufacturing costs include the smokestacks, outflow pipes, toxic-waste dumps, sweatshopped factories, and open-pit mines worked by slave labor mentioned earlier, and the social impacts of their widespread adoption would include another wave of mass technological unemployment—remember, it’s only in the highly mediated world of current economic propaganda that people who lose their jobs due to automation automatically get new jobs in some other field; in the immediate world, that’s become increasingly uncommon. As long as people look at 3-D printers through minds full of little pictures of Star Trek replicators, though, those externalized ecological and social costs are going to be invisible to them.

That, in turn, defines the problem with the externalization of the human mind and imagination: no matter how frantically you manipulate abstractions, the immediate world is still what it is, and it can still clobber you. Externalizing a cost doesn’t make it go away; it just guarantees that you won’t see it in time to do anything but suffer the head-on impact.

Wednesday, March 04, 2015

Peak Meaninglessness

Last week’s discussion of externalities—costs of doing business that get dumped onto the economy, the community, or the environment, so that those doing the dumping can make a bigger profit—is, I’m glad to say, not the first time this issue has been raised recently.  The long silence that closed around such things three decades ago is finally cracking; they’re being mentioned again, and not just by archdruids.  One of my readers—tip of the archdruidical hat to Joe McInerney—noted an article in Grist a while back that pointed out the awkward fact that none of the twenty biggest industries in today’s world could break even, much less make a profit, if they had to pay for the damage they do to the environment.

Now of course the conventional wisdom these days interprets that statement to mean that it’s unfair to make those industries pay for the costs they impose on the rest of us—after all, they have a God-given right to profit at everyone else’s expense, right?  That’s certainly the attitude of fracking firms in North Dakota, who recently proposed that  they ought to be exempted from the state’s rules on dumping radioactive waste, because following the rules would cost them too much money. That the costs externalized by the fracking industry will sooner or later be paid by others, as radionuclides in fracking waste work their way up the food chain and start producing cancer clusters, is of course not something anyone in the industry or the media is interested in discussing.

Watch this sort of thing, and you can see the chasm opening up under the foundations of industrial society. Externalized costs don’t just go away; one way or another, they’re going to be paid, and costs that don’t appear on a company’s balance sheet still affect the economy. That’s the argument of The Limits to Growth, still the most accurate (and thus inevitably the most reviled) of the studies that tried unavailingly to turn industrial society away from its suicidal path: on a finite planet, once an inflection point is passed, the costs of economic growth rise faster than growth does, and sooner or later force the global economy to its knees.

The tricks of accounting that let corporations pretend that their externalized costs vanish into thin air don’t change that bleak prognosis. Quite the contrary, the pretense that externalities don’t matter just makes it harder for a society in crisis to recognize the actual source of its troubles. I’ve come to think that that’s the unmentioned context behind a dispute currently roiling those unhallowed regions where economists lurk in the shrubbery: the debate over secular stagnation.

Secular stagnation? That’s the concept, unmentionable until recently, that the global economy could stumble into a rut of slow, no, or negative growth, and stay there for years. There are still plenty of economists who insist that this can’t happen, which is rather funny, really, when you consider that this has basically been the state of the global economy since 2009. (My back-of-the-envelope calculations suggest, in fact, that if you subtract the hallucinatory paper wealth manufactured by derivatives and similar forms of financial gamesmanship from the world’s GDP, the production of nonfinancial goods and services worldwide has actually been declining since before the 2008 housing crash.)

Even among those who admit that what’s happening can indeed happen, there’s no consensus as to how or why such a thing could occur.  On the off chance that any mainstream economists are lurking in the shrubbery in the even more unhallowed regions where archdruids utter unspeakable heresies, and green wizards clink mugs of homebrewed beer together and bay at the moon, I have a suggestion to offer: the most important cause of secular stagnation is the increasing impact of externalities on the economy. The dishonest macroeconomic bookkeeping that leads economists to think that externalized costs go away because they’re not entered into anyone’s ledger books doesn’t actually make them disappear; instead, they become an unrecognized burden on the economy as a whole, an unfelt headwind blowing with hurricane force in the face of economic growth.

Thus there’s a profound irony in the insistence by North Dakota fracking firms that they ought to be allowed to externalize even more of their costs in order to maintain their profit margin. If I’m right, the buildup of externalized costs is what’s causing the ongoing slowdown in economic activity worldwide that’s driving down commodity prices, forcing interest rates in many countries to zero or below, and resurrecting the specter of deflationary depression. The fracking firms in question thus want to respond to the collapse in oil prices—a result of secular stagnation—by doing even more of what’s causing secular stagnation. To say that this isn’t likely to end well is to understate the case considerably.

In the real world, of course, mainstream economists don’t listen to suggestions from archdruids, and fracking firms, like every other business concern these days, can be expected to put their short-term cash flow ahead of the survival of their industry, or for that matter of industrial civilization as a whole. Thus I propose to step aside from the subject of economic externalities for a moment—though I’ll be returning to it at intervals as we proceed with this sequence of posts—in order to discuss a subtler and less crassly financial form of the same phenomenon.

That form came in for discussion in the same post two weeks ago that brought the issue of externalities into this blog’s ongoing conversation. Quite a few readers commented about the many ways in which things labeled “more advanced,” “more progressive,” and the like were actually less satisfactory and less effective at meeting human needs than the allegedly more primitive technologies they replaced. Some of those comments focused, and quite sensibly, on the concrete examples, but others pondered the ways that today’s technology fails systematically at meeting certain human needs, and reflected on the underlying causes for that failure. One of my readers—tip of the archdruidical hat here to Ruben—gave an elegant frame for that discussion by suggesting that the peak of technological complexity in our time may also be described as peak meaninglessness.

I’d like to take the time to unpack that phrase. In the most general sense, technologies can be divided into two broad classes, which we can respectively call tools and prosthetics. The difference is a matter of function. A tool expands human potential, giving people the ability to do things they couldn’t otherwise do. A prosthetic, on the other hand, replaces human potential, doing something that under normal circumstances, people can do just as well for themselves.  Most discussions of technology these days focus on tools, but the vast majority of technologies that shape the lives of people in a modern industrial society are not tools but prosthetics.

Prosthetics have a definite value, to be sure. Consider an artificial limb, the sort of thing on which the concept of technology-as-prosthetic is modeled. If you’ve lost a leg in an accident, say, an artificial leg is well worth having; it replaces a part of ordinary human potential that you don’t happen to have any more, and enables you to do things that other people can do with their own leg. Imagine, though, that some clever marketer were to convince people to have their legs cut off so that they could be fitted for artificial legs. Imagine, furthermore, that the advertising for artificial legs became so pervasive, and so successful, that nearly everybody became convinced that human legs were hopelessly old-fashioned and ugly, and rushed out to get their legs amputated so they could walk around on artificial legs.

Then, of course, the manufacturers of artificial arms got into the same sort of marketing, followed by the makers of sex toys. Before long you’d have a society in which most people were gelded quadruple amputees fitted with artificial limbs and rubber genitals, who spent all their time talking about the wonderful things they could do with their prostheses. Only in the darkest hours of the night, when the TV was turned off, might some of them wonder why it was that a certain hard-to-define numbness had crept into all their interactions with other people and the rest of the world.

In a very real sense, that’s the way modern industrial society has reshaped and deformed human life for its more privileged inmates. Take any human activity, however humble or profound, and some clever marketer has found a way to insert a piece of technology in between the person and the activity. You can’t simply bake bread—a simple, homely, pleasant activity that people have done themselves for thousands of years using their hands and a few simple handmade tools; no, you have to have a bread machine, into which you dump a prepackaged mix and some liquid, push a button, and stand there being bored while it does the work for you, if you don’t farm out the task entirely to a bakery and get the half-stale industrially extruded product that passes for bread these days.

Now of course the bread machine manufacturers and the bakeries pitch their products to the clueless masses by insisting that nobody has time to bake their own bread any more. A long time ago, in Energy and Equity, Ivan Illich pointed out the logical fallacy here: using a bread machine or buying from a bakery is only faster if you don’t count the time you have to spend earning the money needed to pay for it, power it, provide it with overpriced prepackaged mixes, repair it, clean it, etc., etc., etc. Illich’s discussion focused on automobiles; he pointed out that if you take the distance traveled by the average American auto in a year, and divide that by the total amount of time spent earning the money to pay for the auto, fuel, maintenance, insurance, etc., plus all the other time eaten up by tending to the auto in various ways, the average American car goes about 3.5 miles an hour: about the same pace, that is, that an ordinary human being can walk.
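Illich’s arithmetic is simple enough to sketch in a few lines of Python. The figures below are purely illustrative, chosen to show how ordinary-looking numbers produce a walking-pace result; they are not Illich’s own data:

```python
def effective_speed(annual_miles, hours_driving, hours_earning, hours_tending):
    """Miles per hour once every car-related hour is counted, not just time behind the wheel."""
    return annual_miles / (hours_driving + hours_earning + hours_tending)

# Hypothetical yearly figures: 10,000 miles driven; 350 hours behind the wheel;
# 2,300 hours of work devoted to earning the car's total costs (purchase, fuel,
# insurance, repairs); 200 hours spent on maintenance, errands, and paperwork.
speed = effective_speed(10_000, 350, 2_300, 200)
print(f"{speed:.1f} mph")  # roughly walking pace
```

The point of the exercise isn’t the exact numbers, which vary with wages and car costs, but that the denominator balloons once the externalized time costs are counted back in.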

If this seems somehow reminiscent of last week’s discussion of externalities, dear reader, it should. The claim that technology saves time and labor only seems to make sense if you ignore a whole series of externalities—in this case, the time you have to put into earning the money to pay for the technology and into coping with whatever requirements, maintenance needs, and side effects the technology has. Have you ever noticed that the more “time-saving technologies” you bring into your life, the less free time you have? This is why—and it’s also why the average medieval peasant worked shorter hours, had more days off, and kept a larger fraction of the value of his labor than you do.

Something else is being externalized by prosthetic technology, though, and it’s that additional factor that gives Ruben’s phrase “peak meaninglessness” its punch. What are you doing, really, when you use a bread machine? You’re not baking bread; the machine is doing that. You’re dumping a prepackaged mix and some water into a machine, closing the lid, pushing a button, and going away to do something else. Fair enough—but what is this “something else” that you’re doing? In today’s industrial societies, odds are you’re going to go use another piece of prosthetic technology, which means that once again, you’re not actually doing anything. A machine is doing something for you. You can push that button and walk away, but again, what are you going to do with your time? Use another machine?

The machines that industrial society uses to give this infinite regress somewhere to stop—televisions, video games, and computers hooked up to the internet—simply take the same process to its ultimate extreme. Whatever you think you’re doing when you’re sitting in front of one of these things, what you’re actually doing is staring at little colored pictures on a glass screen and pushing some buttons. All things considered, this is a profoundly boring activity, which is why the little colored pictures jump around all the time; that’s to keep your nervous system so far off balance that you don’t notice just how tedious it is to spend hours at a time staring at little colored pictures on a screen.

I can’t help but laugh when people insist that the internet is an information-rich environment. It’s quite the opposite, actually: all you get from it is the very narrow trickle of verbal, visual, and auditory information that can squeeze through the digital bottleneck and turn into little colored pictures on a glass screen. The best way to experience this is to engage in a media fast—a period in which you deliberately cut yourself off from all electronic media for a week or more, preferably in a quiet natural environment. If you do that, you’ll find that it can take two or three days, or even more, before your numbed and dazzled nervous system recovers far enough that you can begin to tap in to the ocean of sensory information and sensual delight that surrounds you at every moment. It’s only then, furthermore, that you can start to think your own thoughts and dream your own dreams, instead of just rehashing whatever the little colored pictures tell you.

A movement of radical French philosophers back in the 1960s, the Situationists, argued that modern industrial society is basically a scheme to convince people to hand over their own human capabilities to the industrial machine, so that imitations of those capabilities can be sold back to them at premium prices. It was a useful analysis then, and it’s even more useful now, when the gap between realities and representations has become even more drastic than it was back then. These days, as often as not, what gets sold to people isn’t even an imitation of some human capability, but an abstract representation of it, an arbitrary marker with only the most symbolic connection to what it represents.

This is one of the reasons why I think it’s deeply mistaken to claim that Americans are materialistic. Americans are arguably the least materialistic people in the world; no actual materialist—no one who had the least appreciation for actual physical matter and its sensory and sensuous qualities—could stand the vile plastic tackiness of America’s built environment and consumer economy for a fraction of a second.  Americans don’t care in the least about matter; they’re happy to buy even the most ugly, uncomfortable, shoddily made and absurdly overpriced consumer products you care to imagine, so long as they’ve been convinced that having those products symbolizes some abstract quality they want, such as happiness, freedom, sexual pleasure, or what have you.

Then they wonder, in the darkest hours of the night, why all the things that are supposed to make them happy and satisfied somehow never manage to do anything of the kind. Of course there’s a reason for that, too, which is that happy and satisfied people don’t keep on frantically buying products in a quest for happiness and satisfaction. Still, the little colored pictures keep showing them images of people who are happy and satisfied because they guzzle the right brand of tasteless fizzy sugar water, and pay for the right brand of shoddily made half-disposable clothing, and keep watching the little colored pictures: that last above all else. “Tune in tomorrow” is the most important product that every media outlet sells, and they push it every minute of every day on every stop and key.

That is to say, between my fantasy of voluntary amputees eagerly handing over the cash for the latest models of prosthetic limbs, and the reality of life in a modern industrial society, the difference is simply in the less permanent nature of the alterations imposed on people here and now.  It’s easier to talk people into amputating their imaginations than it is to convince them to amputate their limbs, but it’s also a good deal easier to reverse the surgery.

What gives this even more importance than it would otherwise have, in turn, is that all this is happening in a society that’s hopelessly out of touch with the realities that support its existence, and that relies on bookkeeping tricks of the sort discussed toward the beginning of this essay to maintain the fantasy that it’s headed somewhere other than history’s well-used compost bin. The externalization of the mind and the imagination plays just as important a role in maintaining that fantasy as the externalization of costs—and the cold mechanical heart of the externalization of the mind and imagination is mediation, the insertion of technological prosthetics into the space between the individual and the world. We’ll talk more about that in next week’s post.

****************
In other news, I’m delighted to report the publication of a new book of mine that may be of particular interest to readers of this blog: Collapse Now and Avoid the Rush: The Best of the Archdruid Report, which is just out from Founders House Publishing. As the title suggests, it’s an anthology of twenty-five of the most popular weekly posts from this blog, including such favorites as "Knowing Only One Story," "An Elegy for the Age of Space," "The Next Ten Billion Years," and "The Time of the Seedbearers," as well as the title essay and many more. These are the one-of-a-kind essays that haven’t appeared in my books; if you’re looking for something to hand to the spouse or friend or twelve-year-old kid who wants to know why you keep visiting this site every Wednesday night, or simply want this blog’s best essays in a more permanent form, this is the book. It’s available in print and e-book formats and can be ordered here.

Wednesday, February 25, 2015

The Externality Trap, or, How Progress Commits Suicide

I've commented more than once in these essays about the cooperative dimension of writing:  the way that even the most solitary of writers inevitably takes part in what Mortimer Adler used to call the Great Conversation, the flow of ideas and insights across the centuries that’s responsible for most of what we call culture. Sometimes that conversation takes place second- or third-hand—for example, when ideas from two old books collide in an author’s mind and give rise to a third book, which will eventually carry the fusion to someone else further down the stream of time—but sometimes it’s far more direct.

Last week’s post here brought an example of the latter kind. My attempt to cut through the ambiguities surrounding that slippery word “progress” sparked a lively discussion on the comments page of my blog about just exactly what counted as progress, what factors made one change “progressive” while another was denied that label. In the midst of it all, one of my readers—tip of the archdruidical hat to Jonathan—proposed an unexpected definition:  what makes a change qualify as progress, he suggested, is that it increases the externalization of costs. 

I’ve been thinking about that definition since Jonathan proposed it, and it seems to me that it points up a crucial and mostly unrecognized dimension of the crisis of our time. To make sense of it, though, it’s going to be necessary to delve briefly into economic jargon.

Economists use the term “externalities” to refer to the costs of an economic activity that aren’t paid by either party in an exchange, but are pushed off onto somebody else. You won’t hear a lot of talk about externalities these days; in many circles, it’s considered impolite to mention them, but they’re a pervasive presence in contemporary life, and play a very large role in some of the most intractable problems of our age. Some of those problems were discussed by Garrett Hardin in his famous essay on the tragedy of the commons, and more recently by Elinor Ostrom in her studies of how that tragedy can be avoided; still, I’m not sure how often it’s recognized that the phenomena they discussed apply not just to commons systems, but to societies as a whole—especially to societies like ours.

An example may be useful here. Let’s imagine a blivet factory, which turns out three-prong, two-slot blivets in pallet loads for customers. The blivet-making process, like manufacturing of every other kind, produces waste as well as blivets, and we’ll assume for the sake of the example that blivet waste is moderately toxic and causes health problems in people who ingest it. The blivet factory produces one barrel of blivet waste for every pallet load of blivets it ships. The cheapest option for dealing with the waste, and thus the option that economists favor, is to dump it into the river that flows past the factory.

Notice what happens as a result of this choice. The blivet manufacturer has maximized his own benefit from the manufacturing process, by avoiding the expense of finding some other way to deal with all those barrels of blivet waste. His customers also benefit, because blivets cost less than they would if the cost of waste disposal was factored into the price. On the other hand, the costs of dealing with the blivet waste don’t vanish like so much twinkle dust; they are imposed on the people downstream who get their drinking water from the river, or from aquifers that receive water from the river, and who suffer from health problems because there’s blivet waste in their water. The blivet manufacturer is externalizing the cost of waste disposal; his increased profits are being paid for at a remove by the increased health care costs of everyone downstream.

That’s how externalities work. Back in the days when people actually talked about the downsides of economic growth, there was a lot of discussion of how to handle externalities, and not just on the leftward end of the spectrum.  I recall a thoughtful book titled TANSTAAFL—that’s an acronym, for those who don’t know their Heinlein, for “There Ain’t No Such Thing As A Free Lunch”—which argued, on solid libertarian-conservative grounds, that the environment could best be preserved by making sure that everyone paid full sticker price for the externalities they generated. Today’s crop of pseudoconservatives, of course, turned their back on all this a long time ago, and insist at the top of their lungs on their allegedly God-given right to externalize as many costs as they possibly can.  This is all the more ironic in that most pseudoconservatives claim to worship a God who said some very specific things about “what ye do to the least of these,” but that’s a subject for a different post.

Economic life in the industrial world these days can be described, without too much inaccuracy, as an arrangement set up to allow a privileged minority to externalize nearly all their costs onto the rest of society while pocketing as much of the benefits as possible themselves. That’s come in for a certain amount of discussion in recent years, but I’m not sure how many of the people who’ve participated in those discussions have given any thought to the role that technological progress plays in facilitating the internalization of benefits and the externalization of costs that drive today’s increasingly inegalitarian societies. Here again, an example will be helpful.

Before the invention of blivet-making machinery, let’s say, blivets were made by old-fashioned blivet makers, who hammered them out on iron blivet anvils in shops that were to be found in every town and village. Like other handicrafts, blivet-making was a living rather than a ticket to wealth; blivet makers invested their own time and muscular effort in their craft, and turned out enough in the way of blivets to meet the demand. Notice also the effect on the production of blivet waste. Since blivets were being made one at a time rather than in pallet loads, the total amount of waste was smaller; the conditions of handicraft production also meant that blivet makers and their families were more likely to be exposed to the blivet waste than anyone else, and so had an incentive to invest the extra effort and expense to dispose of it properly. Since blivet makers were ordinary craftspeople rather than millionaires, furthermore, they weren’t as likely to be able to buy exemption from local health laws.

The invention of the mechanical blivet press changed that picture completely. Since one blivet press could do as much work as fifty blivet makers, the income that would have gone to those fifty blivet makers and their families went instead to one factory owner and his stockholders, with as small a share as possible set aside for the wage laborers who operated the blivet press. The factory owner and stockholders had no incentive to pay for the proper disposal of the blivet waste, either—quite the contrary, since having to meet the disposal costs cut into their profit, buying off local governments was much cheaper, and if the harmful effects of blivet waste were known, you can bet that the owner and shareholders all lived well upstream from the factory.

Notice also that a blivet manufacturer who paid a living wage to his workers and covered the costs of proper waste disposal would have to charge a higher price for blivets than one who did neither, and thus would be driven out of business by his more ruthless competitor. Externalities aren’t simply made possible by technological progress, in other words; they’re the inevitable result of technological progress in a market economy, because externalizing the costs of production is in most cases the most effective way to outcompete rival firms, and the firm that succeeds in externalizing the largest share of its costs is the most likely to prosper and survive.

Each further step in the progress of blivet manufacturing, in turn, tightened the same screw another turn. Today, to finish up the metaphor, the entire global supply of blivets is made in a dozen factories in  distant Slobbovia, where sweatshop labor under ghastly working conditions and the utter absence of environmental regulations make the business of blivet fabrication more profitable than anywhere else. The blivets are as shoddily made as possible; the entire blivet supply chain from the open-pit mines worked by slave labor that provide the raw materials to the big box stores with part-time, poorly paid staff selling blivetronic technology to the masses is a human and environmental disaster.  Every possible cost has been externalized, so that the two multinational corporations that dominate the global blivet industry can maintain their profit margins and pay absurdly high salaries to their CEOs.

That in itself is bad enough, but let’s broaden the focus to include the whole systems in which blivet fabrication takes place: the economy as a whole, society as a whole, and the biosphere as a whole. The impact of technology on blivet fabrication in a market economy has predictable and well understood consequences for each of these whole systems, which can be summed up precisely in the language we’ve already used. In order to maximize its own profitability and return on shareholder investment, the blivet industry externalizes costs in every available direction. Since nobody else wants to bear those costs, either, most of them end up being passed on to the whole systems just named, because the economy, society, and the biosphere have no voice in today’s economic decisions.

Like the costs of dealing with blivet waste, though, the other externalized costs of blivet manufacture don’t go away just because they’re externalized. As externalities increase, they tend to degrade the whole systems onto which they’re dumped—the economy, society, and the biosphere. This is where the trap closes tight, because blivet manufacturing exists within those whole systems, and can’t be carried out unless all three systems are sufficiently intact to function in their usual way. As those systems degrade, their ability to function degrades also, and eventually one or more of them breaks down—the economy plunges into a depression, the society disintegrates into anarchy or totalitarianism, the biosphere shifts abruptly into a new mode that lacks adequate rainfall for crops—and the manufacture of blivets stops because the whole system that once supported it has stopped doing so.

Notice how this works out from the perspective of someone who’s benefiting from the externalization of costs by the blivet industry—the executives and stockholders in a blivet corporation, let’s say. As far as they’re concerned, until very late in the process, everything is fine and dandy: each new round of technological improvements in blivet fabrication increases their profits, and if each such step in the onward march of progress also means that working class jobs are eliminated or offshored, democratic institutions implode, toxic waste builds up in the food chain, or what have you, hey, that’s not their problem—and after all, that’s just the normal creative destruction of capitalism, right?

That sort of insouciance is easy for at least three reasons. First, the impacts of externalities on whole systems can pop up a very long way from the blivet factories.  Second, in a market economy, everyone else is externalizing their costs as enthusiastically as the blivet industry, and so it’s easy for blivet manufacturers (and everyone else) to insist that whatever’s going wrong is not their fault.  Third, and most crucially, whole systems as stable and enduring as economies, societies, and biospheres can absorb a lot of damage before they tip over into instability. The process of externalization of costs can thus run for a very long time, and become entrenched as a basic economic habit, long before it becomes clear to anyone that continuing along the same route is a recipe for disaster.

Even when externalized costs have begun to take a visible toll on the economy, society, and the biosphere, furthermore, any attempt to reverse course faces nearly insurmountable obstacles. Those who profit from the existing order of things can be counted on to fight tooth and nail for the right to keep externalizing their costs: after all, they have to pay the full price for any reduction in their ability to externalize costs, while the benefits created by not imposing those costs on whole systems are shared among all participants in the economy, society, and the biosphere respectively. Nor is it necessarily easy to trace back the causes of any given whole-system disruption to specific externalities benefiting specific people or industries. It’s rather like loading hanging weights onto a chain; sooner or later, as the amount of weight hung on the chain goes up, the chain is going to break, but the link that breaks may be far from the last weight that pushed things over the edge, and every other weight on the chain made its own contribution to the end result.

A society that’s approaching collapse because too many externalized costs have been loaded onto the whole systems that support it thus shows certain highly distinctive symptoms. Things are going wrong with the economy, society, and the biosphere, but nobody seems to be able to figure out why; the measurements economists use to determine prosperity show contradictory results, with those that measure the profitability of individual corporations and industries giving much better readings than those that measure the performance of whole systems; the rich are convinced that everything is fine, while outside the narrowing circles of wealth and privilege, people talk in low voices about the rising spiral of problems that beset them from every side. If this doesn’t sound familiar to you, dear reader, you probably need to get out more.

At this point it may be helpful to sum up the argument I’ve developed here:

a) Every increase in technological complexity tends also to increase the opportunities for externalizing the costs of economic activity;

b) Market forces make the externalization of costs mandatory rather than optional, since economic actors that fail to externalize costs will tend to be outcompeted by those that do;

c) In a market economy, as all economic actors attempt to externalize as many costs as possible, externalized costs will tend to be passed on preferentially and progressively to whole systems such as the economy, society, and the biosphere, which provide necessary support for economic activity but have no voice in economic decisions;

d) Given unlimited increases in technological complexity, there is no necessary limit to the loading of externalized costs onto whole systems short of systemic collapse;

e) Unlimited increases in technological complexity in a market economy thus necessarily lead to the progressive degradation of the whole systems that support economic activity;

f) Technological progress in a market economy  is therefore self-terminating, and ends in collapse.

Now of course there are plenty of arguments that could be deployed against this modest proposal. For example, it could be argued that progress doesn’t have to generate a rising tide of externalities. The difficulty with this argument is that externalization of costs isn’t an accidental side effect of technology but an essential aspect—it’s not a bug, it’s a feature. Every technology is a means of externalizing some cost that would otherwise be borne by a human body. Even something as simple as a hammer takes the wear and tear that would otherwise affect the heel of your hand, let’s say, and transfers it to something else: directly, to the hammer; indirectly, to the biosphere, by way of the trees that had to be cut down to make the charcoal to smelt the iron, the plants that were shoveled aside to get the ore, and so on.

For reasons that are ultimately thermodynamic in nature, the more complex a technology becomes, the more costs it generates. To outcompete a simpler technology, each more complex technology thus has to externalize a significant proportion of its additional costs. In the case of such contemporary hypercomplex technosystems as the internet, the process of externalizing costs has gone so far, through so many tangled interrelationships, that it’s remarkably difficult to figure out exactly who’s paying for how much of the gargantuan inputs needed to keep the thing running. This lack of transparency feeds the illusion that large systems are cheaper than small ones, by making externalities of scale look like economies of scale.

It might be argued instead that a sufficiently stringent regulatory environment, forcing economic actors to absorb all the costs of their activities instead of externalizing them onto others, would be able to stop the degradation of whole systems while still allowing technological progress to continue. The difficulty here is that increased externalization of costs is what makes progress profitable. As just noted, all other things being equal, a complex technology will on average be more expensive in real terms than a simpler technology, for the simple fact that each additional increment of complexity has to be paid for by an investment of energy and other forms of real capital.

Strip complex technologies of the subsidies that transfer some of their costs to the government, the perverse regulations that transfer some of their costs to the rest of the economy, the bad habits of environmental abuse and neglect that transfer some of their costs to the biosphere, and so on, and pretty soon you’re looking at hard economic limits to technological complexity, as people forced to pay the full sticker price for complex technologies maximize their benefits by choosing simpler, more affordable options instead. A regulatory environment sufficiently strict to keep technology from accelerating to collapse would thus bring technological progress to a halt by making it unprofitable.

Notice, however, the flipside of the same argument: a society that chose to stop progressing technologically could maintain itself indefinitely, so long as its technologies weren’t dependent on nonrenewable resources or the like. The costs imposed by a stable technology on the economy, society, and the biosphere would be more or less stable, rather than increasing over time, and it would therefore be much easier to figure out how to balance out the negative effects of those externalities and maintain the whole system in a steady state.  Societies that treated technological progress as an option rather than a requirement, and recognized the downsides to increasing complexity, could also choose to reduce complexity in one area in order to increase it in another, and so on—or they could just raise a monument to the age of progress, and go do something else instead.

The logic suggested here requires a comprehensive rethinking of most of the contemporary world’s notions about technology, progress, and the good society. We’ll begin that discussion in future posts—after, that is, we discuss a second dimension of progress that came out of last week’s discussion.