Bob Ellis, ABC, Australia - Fewer deaths occurred in Hiroshima in August 1945 than in Port-au-Prince last week, and more people will die there soon than in Rwanda in 1994. Yet the modern global world was unprepared for it, so busy was it with terrorism, which has killed fewer people in the last thirty years than quarrelsome Americans with handguns have in the last eight months.

When are we going to get the arithmetic right, and distinguish what threatens us mightily from what threatens us barely at all?

Cuba, a socialist state, is well-prepared for natural disaster and few die there in the hurricane season, and rebuilding happens quickly. The United States, a capitalist nation, was ill-prepared for Hurricane Katrina though experts had warned for years of broken dikes, inundation, chaos, disease and looting, and its response was an international joke.

China, a socialist state, handles earthquakes well. Australia, a social democratic state, handles floods and bushfires fairly well. Yet on the US's back doorstep a million people may die soon, thirsting to death under piles of bricks or in those rapidly-spreading diseases that follow earthquake, unhelped by America whose borrowed billions were that day bombing Kandahar not funding ambulance teams in Port-au-Prince.

When will we get our priorities right, and realize our biggest foe is wild nature not militant Islam and do such things as we can to survive it?

Thirty per cent of earth's carbon asphyxiation comes each year from bushfires, and for a hundred billion in the next ten years we could fly a lot of Elvises over Indonesian forests round the clock and we do not. We're installing electronic peekaboo machines in airports instead, though strangely not in theatre foyers or football stadiums, lest one more Underpants Bomber board a plane. Half a billion dollars would redirect an eastern river and save the Lachlan, Murrumbidgee and Murray but we're spending the money instead on gyms and computers for private schools. While you've been reading this three Haitians have died under heaped-up stone unrescued and an AIG executive has earned two hundred dollars for helping wreck the world economy, and he'll earn three thousand more in the next hour while twenty more Haitians die.

When will we get our priorities right, and learn how useless the free market is in dealing with tsunamis, earthquakes, Aboriginal health, African AIDS, Middle Eastern pogroms, Chinese tyranny and the sort of shameful poverty that breeds terrorists everywhere and sends them walking in explosive underpants out of universities into airline waiting rooms? When will we understand that twenty dollars a week is better spent on tax-funded air ambulances and Elvises and hot rocks and wind power and stem cells and solar cars than on oil magnates who are killing the planet as we speak?

You can argue that Haiti was a basket-case already and had been that way for decades. But in those selfsame rapacious decades capitalist America had been refusing it aid, and sending back Haitian immigrants who might not have starved, or whored themselves, or taken up voodoo if America had taken them in instead of spending its money on the drug-running, election-cheating Karzai brothers for reasons that altered each month. Why was faraway Afghanistan more urgent to America than Haiti, its near neighbor? Why? Oh yes, that's right, a certain tall Saudi was thought to be living there, so it had to be bombed to smithereens. Makes all the sense in the world, when you come to think of it. . .

Why are we getting it so wrong? Why are we so afraid of tax, and so welcoming of useless executives on ten million a year, or a hundred and forty an hour around the clock? Why are we spending so much of our money on them, and so little on bushfire prevention or flood rescue? Why are so many people dying because we find a young stranger's jockstrap more interesting than the end of life on earth? Are some people making money, perhaps, out of emphasising the unimportant and spinning the planet's fate into invisibility? Arms manufacturers, oil barons, Halliburton, Blackwater and so on?

Could be, old friend, could be.

Helicopter-gunships have been illegally over-flying Pakistani villages while you've been reading this, and they could have been rescuing buried children in Port-au-Prince.

Was this well done? Was this honorably done? How clever was it? How useful in encouraging future terrorists to reject jihad and choose capitalism instead?

How are we doing, old friend?

Are we winning?

Or are they?

Saturday, January 16, 2010


John Taylor Gatto, 1989 - We live in a time of great social crisis. Our children rank at the bottom of nineteen industrial nations in reading, writing, and arithmetic. The world's narcotic economy is based upon our own consumption of this commodity. If we didn't buy so many powdered dreams, the business would collapse - and schools are an important sales outlet. Our teenage-suicide rate is the highest in the world - and suicidal kids are rich kids for the most part, not poor. In Manhattan, 70 percent of all new marriages last less than five years.

Our school crisis is a reflection of this greater social crisis. We seem to have lost our identity. Children and old people are penned up and locked away from the business of the world to an unprecedented degree; nobody talks to them anymore. Without children and old people mixing in daily life, a community has no future and no past, only a continuous present. In fact, the term "community" hardly applies to the way we interact with each other. We live in networks, not communities, and everyone I know is lonely because of that. In some strange way, school is a major actor in this tragedy, just as it is a major actor in the widening gulfs among social classes. Using school as a sorting mechanism, we appear to be on the way to creating a caste system, complete with untouchables who wander through subway trains begging and sleep on the streets.

I've noticed a fascinating phenomenon in my twenty-nine years of teaching - that schools and schooling are increasingly irrelevant to the great enterprises of the planet. . . The truth is that schools don't really teach anything except how to obey orders. This is a great mystery to me, because thousands of humane, caring people work in schools as teachers and aides and administrators, but the abstract logic of the institution overwhelms their individual contributions. Although teachers do care and do work very, very hard, the institution is psychopathic; it has no conscience. It rings a bell, and the young man in the middle of writing a poem must close his notebook and move to a different cell, where he learns that humans and monkeys derive from a common ancestor. . .

Now, here is a curious idea to ponder: Senator Ted Kennedy's office released a paper not too long ago claiming that prior to compulsory education the state literacy rate was 98 percent, and after it the figure never again climbed above 91 percent, where it stands in 1990. I hope that interests you.

Here is another curiosity to think about: The home-schooling movement has quietly grown to a size where 1.5 million young people are being educated entirely by their own parents. Last month the education press reported the amazing news that children schooled at home seem to be five or even ten years ahead of their formally trained peers in their ability to think.

I don't think we'll get rid of schools any time soon, certainly not in my lifetime, but if we're going to change what's rapidly becoming a disaster of ignorance, we need to realize that the institution "schools" very well, but it does not "educate"; that's inherent in the design of the thing. It's not the fault of bad teachers or too little money spent. It's just impossible for education and schooling to be the same thing.

Schools were designed by Horace Mann and Barnas Sears and W.R. Harper of the University of Chicago and Edward Thorndike of Columbia Teachers College and others to be instruments for the scientific management of a mass population. Schools are intended to produce, through the application of formulas, formulaic human beings whose behavior can be predicted and controlled.

To a very great extent, schools succeed in doing this. But our society is disintegrating, and in such a society, the only successful people are self-reliant, confident, and individualistic - because the community life that protects the dependent and weak is dead. The products of schooling are, as I've said, irrelevant. Well-schooled people are irrelevant. They can sell film and razor blades, push paper and talk on telephones, or sit mindlessly before a flickering computer terminal, but as human beings they are useless - useless to others and useless to themselves.

The daily misery around us is, I think, in large measure caused by the fact that - as social critic Paul Goodman put it thirty years ago - we force children to grow up absurd. Any reform in schooling has to deal with school's absurdities.

It is absurd and anti-life to be part of a system that compels you to sit in confinement with only people of exactly the same age and social class. The system effectively cuts you off from the immense diversity of life and the synergy of variety. It cuts you off from your own past and future, sealing you in a continuous present, much the same way television does.

It is absurd and anti-life to be part of a system that compels you to listen to a stranger reading poetry when you want to learn to construct buildings, or to sit with a stranger discussing the construction of buildings when you want to read poetry.

It is absurd and anti-life to move from cell to cell at the sound of a gong for every day of your youth, in an institution that allows you no privacy and even follows you into the sanctuary of your home, demanding that you do its "homework." . . .

Keep in mind that in the United States almost nobody who reads, writes, or does arithmetic gets much respect. We are a land of talkers; we pay talkers the most and admire talkers the most, and so our children talk constantly, following the public models of television and schoolteachers. It is very difficult to teach "the basics" anymore, because they really aren't basic to the society we've made.

Wednesday, December 2, 2009


Thursday, September 3, 2009


From an extremely interesting speech by danah boyd, a social media researcher at Microsoft Research New England and a fellow at Harvard Law School's Berkman Center for Internet and Society. Reprinted by Alternet

danah boyd - For decades, we've assumed that inequality in relation to technology has everything to do with "access" and that if we fix the access problem, all will be fine. This is the grand narrative of concepts like the "digital divide."

Yet, increasingly, we're seeing people with similar levels of access engage in fundamentally different ways. And we're seeing a social media landscape where participation "choice" leads to a digital reproduction of social divisions. . . .

Let's deal directly with a very specific case study: MySpace versus Facebook. . .

Two weeks ago, comScore released numbers showing that Facebook and MySpace were neck-and-neck in terms of unique user visits in the U.S. The meta-narrative was that Facebook was winning in the States, and that MySpace was dying.

I would argue that the numbers can be read differently. The numbers show that MySpace has neither grown nor faded in the last year, while Facebook has expanded rapidly and has finally reached the same size. . .

But we still need to account for the fact that as many people visit MySpace as Facebook . . . Even if you think that Facebook is winning the game, we need to account for the fact that 70 million people in the U.S. visited MySpace. That's not small potatoes. . .

I'm an ethnographer. For the last four years, I've been traveling the United States, talking to American teenagers about their use of social media. During the 2006-2007 school year, I started noticing a trend.

In each school, in each part of the country, there were teens who opted for MySpace and teens who opted for Facebook. (There were also plenty of teens who used both.) . . .

MySpace came out first and quickly attracted urban 20-somethings. It spread to teenagers through older siblings and cousins, as well as those who were attracted to indie rock and hip-hop music culture.

Facebook started at Harvard and spread to the Ivy Leagues before spreading more broadly. The first teenagers to hear about Facebook were those connected to the early adopters of Facebook (i.e. the Ivy League-bound types). The desirability of the site spread from those college-bound teens.

As word of these sites spread, teens went to where their friends were. The origin points of these sites explain many people's choices, especially when it comes to first adoption, because people adopt the sites that their friends adopt. Yet this doesn't explain why some people left MySpace to join Facebook and others did not.

One way of thinking about the transition from MySpace to Facebook is through the frame of fashion cycles and fads. MySpace was first; arguably, some people got sick of it and, when Facebook came along, voila! This is certainly true for many teens (and adults), but this explanation would only work if MySpace was dead, or if users of MySpace thought of it as uncool.

The fact is MySpace is still quite popular among a certain segment of the population. Only a month ago, I was doing fieldwork in Atlanta, where I found heavy usage of MySpace among certain groups of youth. They knew of Facebook but had no interest in leaving MySpace to join Facebook.

Herein lies the reality that makes all of this quite messy to deal with. . .

Whites were more likely to leave or choose Facebook. The educated were more likely to leave or choose Facebook. Those from wealthier backgrounds were more likely to leave or choose Facebook. Those from the suburbs were more likely to leave or choose Facebook. Those who deserted MySpace did so by "choice," but their decision to do so was wrapped up in their connections to others, in their belief that a more peaceful, quiet, less-public space would be more idyllic.

This dynamic was furthered by the press, an institution that stems from privilege and tends to reflect the lives of a more privileged class of people. They narrated MySpace as the dangerous underbelly of the Internet, while Facebook was the utopian savior. . .

The fact that digital migration is revealing the same social patterns as urban white flight should send warning signals to everyone out there. And if we think back to the language used by teens who use Facebook when talking about MySpace, we should be truly alarmed.

Those who are from privileged backgrounds tend to be far more condescending toward those who are not than vice versa. . .

The data have consistently shown that MySpace is not a site of increased risk for youth and that risky behavior is more likely to occur in chatrooms than on MySpace. Yet, if you're a parent of a teen in this room, you're probably scared shitless of MySpace.

Why? What are you scared of? Are you scared of the site, or the possibility that your child might be exposed to values that are different than yours? Are you scared of the display of sexuality, or just the display of working-class sexuality? Needless to say, that's a topic for a whole different conversation. . .

Unlike teens, who are often straddling MySpace and Facebook, most adults are active on one or the other, unless they have a specific professional or hobby-based reason to be on both. . .

In many ways, adult worlds are even more divided than teen worlds. Adults are less likely to know other adults who aren't like them than teens are.

There's a concept in sociology called "homophily." It means birds of a feather stick together. Whites know whites. Democrats know Democrats. Urbanites know urbanites. Tech people know tech people. Rich people know rich people. . .

One thing to keep in mind about social media: the Internet mirrors and magnifies pre-existing dynamics. And it makes many different realities much more visible than ever before. . .

So why am I telling you that Facebook and MySpace are divided by race, class, education and other factors? Because it matters. And we need to talk about and address the implications of these divides.

First off, when people are structurally divided, they do not share space with one another, and they do not communicate with one another. This can and does breed intolerance. . .

Think about this in the context of the politics around gay rights. The No. 1 predictor for how someone will side in issues of gay rights is whether or not they know someone who is gay. . .

When you choose MySpace or Facebook, you can't send messages to people on the other site. You can't Friend people on the other site. There's a cultural wall between users. And if there's no way for people to communicate across the divide, you can never expect them to do so.

But here's the main issue with social divisions. We can accept when people choose to connect to people who are like them and not friend different others. But can we accept when institutions and services only support a portion of the network? When politicians only address half of their constituency? When educators and policy makers engage with people only through the tools of the privileged?

When we start leveraging technology to meet specific goals, we may reinforce the divisions that we're trying to address.

If you want people to connect around politics and democracy, information and ideas, you need to understand the divisions that exist.

Many of us in this room see social-network sites as a modern-day incarnation of the public sphere. Politicians log in to these sites to connect with constituents and hear their voices. Campaign managers and activists try to rally people through these sites. Market researchers try to get a sense of people's opinions through these sites. Educators try to connect with students and build knowledge-sharing communities. This is fantastic. But there isn't one uniform public sphere. There are numerous publics (and counter-publics).

In many ways, the Internet is providing a next-generation public sphere. Unfortunately, it's also bringing with it next-generation divides.

Thursday, August 7, 2008


From a sermon at a Unitarian Church in Brunswick, Maine, by Weld Henshaw

I am an atheist. . . I thought it daring to begin my brief sermon with these words, these four words. Brief - the sine qua non of any summer sermon. But I can’t just stop after four words. First, that sentence is not fully honest. Second, I should explain this derives from a real sermon by a real preacher at The Old Ship Meeting House in Hingham twenty-odd years ago. Freshly called to Old Ship, Ken Read-Brown started with, “I am an agnostic.”

It took no small amount of courage and honesty for a rookie with a young family dependent on his ministry. The rest of that sermon was magnificent, something I have never forgotten. Any faith to be true has to be anchored in the bedrock of honesty. And when honesty calls for the confessions of doubt, so be it. . .

So, if I have doubts about the atheist bit, why did I use it? The answer lies in a point made clear by celebrated evolutionary biologist Richard Dawkins. . . Let us guess that in the sentient population, something like twenty per cent are true blue “I know my God is real” types (and “He walks with me and he talks with me …”). Then there is, say, another group who believe in God with an unsteady faith, a belief something short of certainty, but who self-describe as believers.

I read recently that a majority of Americans have only a “weak” belief in an afterlife. So down this slippery slope we have another category, folks who somewhat vaguely believe in God but give it little thought and are conscious of passing clouds, ideations of agnosticism. Next to them are frank agnostics, who think it’s all beyond our ken. Some of them, I for example, do not really believe in God and see no arguments that lead to faith in a real God. We could be called super-agnostics or non-assertive atheists. Without conviction, we guess God might well be a delusion, our own construct to fend off bleak thoughts of future non-being.

Finally, there are true blue flat-out atheists who know no god exists. These tough-minded folk, tiny in number, hold an assertive no-doubts atheism that seems to me flawed by a certainty where certainty does not obtain. Not even Dawkins identifies with these deniers; I’m with Dawkins on this. It just doesn’t make sense to have leaps of non-faith. So this puts me as a leftist agnostic and something close to a functional atheist. I am unaware of any miracle or answered prayer in all human history. To me, the greatest miracle ever was Bill Mazeroski’s home run in the ninth inning of the seventh game of the 1960 World Series, an event ignored by all theologians save a few from my home town, Pittsburgh. . .

The Blind Watchmaker makes it plain and irrefutable that natural selection, the adaptation to conditions and changing conditions, will alter and enhance all life to survive in optimum harmony with its environment. A key to comprehending evolution is that over enough time even tiny advantageous mutations will gradually win out in the ensuing generations of reproduction.

A second lesson from Dawkins is that events do not require causes. It is generally accepted among the educated that our universe began with a monstrous explosion some 15 billion years ago. People far smarter than I are today busy trying to study the first nanoseconds of this Big Bang. What seems well established now is that events, including the Big Bang, do not require a cause or causes. This discovery has been a second setback for creationists. . .

A last mystery, the first life on earth, is still beyond man’s ken, though theories are being postulated and tested. Lightning bolts into primordial soup? Pure speculation. Self-replicating crystals adopting reproductive genes? Intriguing, beguiling, not proven. Stay tuned. . .

So here I take my waffling stand: I am a hedging atheist. So why do I go to this church? Well, I get to associate with wonderful citizens, people filled with warmth, virtue, humor and empathy. I hear things worth hearing. I sing great songs. Most of all, I get to rub shoulders with fine people who live by our seven principles. Add to that I find shared values and principles far more compelling than shared superstitions.

If Unitarians had a major role in the management of this country, as once we did, how very different would be the conduct of our government towards our citizens, towards other governments and their citizens and towards our planet and all things living or inanimate upon it.

The inherent worth and dignity of every person. Justice, equity and compassion in human relations. Acceptance of one another and encouragement to spiritual growth. A free and responsible search for truth and meaning. The right of conscience and the embrace of the democratic process. The goal of world community with peace, liberty and justice for all. Respect for the interdependent web of all existence.

There is nothing in these seven wise principles that requires a deistic faith, nor anything that excludes it. . .

Thomas Hardy once stated that he had been looking for God for fifty years and that, if he existed, Hardy should have found him. My inquiry has been for even longer and some of Hardy’s heroes, particularly the obscure Jude, have helped lead me to where I find myself today. When I was a young Episcopalian Sunday School scholar, I was a believing and devout Christian. I was also terrified of my own inevitable though far-off death. When I assumed there was a God, he struck me as capricious and, too often, cruel beyond all reason. Now, having reached three score and ten, philosophy - Unitarian philosophy - gives me a large measure of peace about an extinction that cannot be far away. My faith is firmly placed in doubt. My principles and this aging carcass are very much at home right here. And our death does not remove us from this interconnected existential web, not ever.

And now, our finale, after which, let us go forth in peace, go forth in doubt, embrace our daunting conundrum; we have light and we have dark.

Wednesday, June 18, 2008



From congressional testimony

That term "the economy". . . what it means, in practice, is the gross domestic product or GDP. It's just a big statistical pot that includes all the money spent in a given period of time. If the pot is bigger than it was the previous quarter, or year, then you cheer. If it isn't bigger, or bigger enough, then you get Bernanke up here and ask him what the heck is going on.

The what of the economy makes no difference in these councils. It never seems to come up. The money in the big pot could be going to cancer treatments or casinos, violent video games or usurious credit card rates. It could go towards the $9 billion or so that Americans spend on gas they burn while they sit in traffic and go nowhere; or the billion plus that goes to drugs such as Ritalin and Prozac that schools are stuffing into kids to keep them quiet in class.

The money could be the $20 billion or so that Americans spend on divorce lawyers each year; or the $5 billion on identity theft; or the billions more spent to repair property damage caused by environmental pollution. The money in the pot could betoken social and environmental breakdown - misery and distress of all kinds. It makes no difference. You don't ask. All you want to know is the total amount, which is the GDP. So long as it is growing then everything is fine. . .

It isn't just you. The President does it, the media, the reporters sitting at that table over there. They do it too. How many of them or of you asked, during the recent debate over the "stimulus" package, exactly what it was that would be stimulated? How many of them say, when Bernanke comes up here to report on the nation's growth, "Hey wait a minute. What exactly are we talking about here?" Doesn't it matter whether it is textbooks or porn magazines, childbirths or treatments for childhood asthma born of bad air? Doesn't it matter whether the expenditure comes from living within our means or from going into financial and ecological debt? Don't we need to know such things before we can say whether the increase in transactions in the pot - what we call "growth" - has been good or not? This is not an argument against growth, by the way. To be reflexively against growth is as numb-minded as to be reflexively for it. Those are theological positions. I am arguing for an empirical one. Let's find out what is growing, and the effects. Tell us what this growth is, in concrete terms. Then we can begin to say whether it has been good or not.

The failure to do this is insane, literally. It is an insanity that is embedded in the political debate, and in media reportage. . . We hear for example that efforts to address climate change will hurt "the economy." Do they mean that if we clean up the air we will spend less money treating asthma in young kids? That Americans will spend fewer billions of dollars on gasoline to sit in traffic jams? That they will spend less on coastal insurance if the sea level stops rising?

There is a basic fallacy here. The atmosphere is part of the economy too - the real economy that is, though not the artificial construct portrayed in the GDP. It does real work, as we would discover quickly if it were to collapse. Yet the GDP does not include this work. If we burn more gas, the expenditure gets added to the GDP. But there is no corresponding subtraction for the toll this burning takes on the thermostatic and buffering functions that the atmosphere provides. (Nor is there a subtraction for the oil we take out of the ground.) Yet if we burn less gas, and thus maintain the crucial functions of the atmosphere, we say "the economy" has suffered, even though the real economy has been enhanced.

With families it's the same thing. By the standard of the GDP, the worst families in America are those that actually function as families - that cook their own meals, take walks after dinner and talk together instead of just farming the kids out to the commercial culture.

Cooking at home, talking with kids, talking instead of driving, involve less expenditure of money than do their commercial counterparts. Solid marriages involve less expenditure for counseling and divorce. Thus they are threats to the economy as portrayed in the GDP. By that standard, the best kids are the ones that eat the most junk food and exercise the least, because they will run up the biggest medical bills for obesity and diabetes.

This kind of thinking has been guiding the economic policy minds of this country for the last sixty years at least. Is it surprising that the family structure is shaky, real community is in decline, and kids have become Petri dishes of market-related dysfunction and disease? The nation has been driving by an instrument panel that portrays such things as growth and therefore good. It is not accidental that the two major protest movements of recent decades - environmental and pro-family - both deal with parts of the real economy that the GDP leaves out and that the commercial culture that embodies it tends to erode or destroy. . .

There are so many examples of expenditure that goes into the GDP yet has a questionable claim to the stature of growth and good, even from the standpoint of those who make it. For example, much consumption is compulsory, in that buyers have little choice. There is fraud, such as the way seniors are cheated in reverse mortgage scams. There are also products that are designed to lock buyers into an endless stream of high-priced replacements, such as inkjet printer cartridges that are designed to resist refilling.

Or what about car bumpers that are designed not to bump, so that a mild fender bender turns into a $5,000 repair bill? . . .

The toughest case for the economic mind is addiction. The GDP assumes, as most economists do, that people are inherently "rational." What they buy is exactly what they want, and so their purchases must make them happy in exact proportion to the prices paid. Yet addiction has become pervasive. It has metastasized far beyond the usual suspects - gambling, tobacco, drink and drugs - and come to roost on such things as eating, credit cards, and shopping itself.

How can anyone assume that buying makes people feel better when those very people are engaged in a mighty struggle to do less of it? . . .

The GDP makes no distinction between a $500 dinner in Manhattan and the hundreds of more humble meals that could be provided for that same amount. An Upper East Side socialite who buys a pair of $800 pumps from Manolo Blahnik appears to contribute forty times more to the national well-being than does the mother who buys a pair of $20 sneakers at Payless for her son. . . . As included in the national accounts, an accretion of luxury buying at the top covers up a lack of necessary buying at the bottom. As the income scale becomes more skewed, as it has in the U.S., the cover-up becomes even greater. In this respect the GDP, when used as a measure of national wellbeing, serves as a statistical laundry operation that hides the suffering at the bottom.

Another problem has to do with work, and the toll it takes on those who do it. . . If the GDP subtracts depreciation on buildings and equipment, should there not be a corresponding subtraction for the wearing out of people? What about the loss in the value of their skills as one technology displaces another? In the current accounting, this toll often gets added to the GDP rather than subtracted, in the form of medications, expenditures for retraining, and day care for children as parents work longer hours. Most workers would regard such outlays as costs not gains. . .

I doubt that it is possible to include all the needed information in one single indicator. There are too many apples and oranges. To value a parent's work in the home at the going market price, for example, is both insulting to parents and an exercise in self-parody for an economics profession that cannot see beyond the realm of market price. But at the very least there needs to be an array of indicators that connects such hidden forms of economic function to a larger economic whole. Here are some principles you might find useful. . .

Time is perhaps the most basic form of wealth. Yet Americans, for all their wealth, are the most time-impoverished people on earth. The time they spend both working and consuming - that is, the time absorbed into the market - comes out of the time available for their families and communities; and both are going wanting as a result. Time is a finite resource, just as coal and oil and dump space in the sky are finite resources. To take more of it for work or consumption is to take it from someplace else. You need to look not just at the money and stuff that people have, but also at the time they have. . .

Most of the crucial life-supporting functions take place outside the realm of monetized exchange. They are not part of the market or the government - both of which function through money - but rather occur through natural or social processes. The help and care of parents and neighbors; the cooling and cleansing functions of trees; woods in which to hike and hunt; clean water in which to fish and swim; these all are off the books. They do not register in the GDP until something destroys them and people have to buy substitutes in the market. This is insane. A tally of economic wellbeing needs to reflect reality, not just the portion of it that is convenient for economists to measure.

Not everything that is called "consumption" represents advance up the mountain of more. Here are a few examples:

--Compulsory expenditures that are built into products, such as cars designed to cost a fortune to repair, and inkjet printer cartridges designed to resist refilling.

-- Fraud and abuse, such as exorbitant fees built into credit cards that issuers increase whenever they want.

-- Medical bills incurred because of other activities that increase the GDP but degrade the environment. An example is medical bills to treat asthma in children brought on by bad air.

-- Addictive consumption, which is shopping that the shoppers themselves wish they could drop. It is hard to see how this could add to wellbeing, when the people doing it think it adds to their own misery instead.

--Defensive consumption, such as the double-pane windows that city dwellers buy to keep out noise from boom box cars and the like on the street.

It is not possible to parse out every single expenditure for its plusses and minuses. But neither is it tenable to assume that every expenditure represents a plus for the individual and society, just because somebody has made it. Yet the GDP starts with just that assumption; or more precisely, the people who interpret the GDP that way do. It is time to begin to make distinctions.

The purpose of an economy is to meet human needs in such a way that life becomes in some respect richer and better in the process. It is not simply to produce a lot of stuff. Stuff is a means, not an end. Yet current modes of economic measurement focus almost entirely on means.

For example, an automobile is productive if it produces transportation. Yet today we look only at the cars produced per hour worked. More cars can mean more traffic and therefore a transportation system that is less productive. The medical system is the same way. The aim should be healthy people, not the sale of more medical services and drugs. Yet today, we assess the economic contribution of the medical system on the basis of treatment rather than results.

Economists see nothing wrong with this. They see no problem that the medical system is expected to produce 30-40% of new jobs over the next 30 years. "We have to spend our money on something," shrugged a Stanford economist to the New York Times. This is more insanity. Next we will be hearing about "disease-led recovery." To stimulate the economy we will have to encourage people to be sick so that the economy can be well.

Jonathan Rowe is a contributing editor of the Washington Monthly and a founder of the Tomales Bay Institute

Monday, May 19, 2008


RICHARD BELL In reading the never-ending conversation among progressives about choosing between an idealistic vote for president or a pragmatic, lesser-of-two-evils vote, I think we should clearly confront the enormous seduction of presidential politics.

The last office where insurgent political movements are likely to have an impact is the office of President. The playing field is vast, the amount of money involved is in the hundreds of millions, and party hacks have had decades to construct layer after layer of party rules, convention rules, and FEC regulations to minimize the impact of any threat to the underlying economics of the one-party corporate state.

Likewise, the probability of success increases as one moves down the ballot, where the candidates and the voters get closer and closer together.

Yet time and again, especially in presidential election years, we find that the vast majority of progressive energy and money winds up being sucked into the maw of presidential politics, leaving progressive down ballot candidates gasping for resources.

Take money. There is much breast-beating about the huge numbers of small online donors. And compared to having few small donors, the increase has to be welcomed. But one has to stop and ask, what is really going on here? Who's getting this online money? What are the expectations of these online donors? And what will their reactions be when the presidential candidates fail to meet their expectations by, say, not getting out of Iraq?

Look at these numbers from a May 13, 2008 report from the Campaign Finance Institute showing that Democratic House candidates have been outraising their Republican counterparts:

Significantly, these advantages have not been based on the small donor fundraising that has been so important in the first months of 2008 to the presidential campaigns of Senators Barack Obama and Hillary Clinton. From Jan. 1, 2007 through March 31, 2008, Obama raised $232 million, 45% of which came in contributions of $200 or less. Clinton raised $172 million over the same fifteen-month period, with 30% coming in amounts of $200 or less.

By way of contrast, the 1,001 House candidates registered so far with the Federal Election Commission raised a total of $447 million between Jan. 1, 2007 and March 31, 2008. Less than 10% of this total arrived in amounts of $200 or less. This is virtually unchanged from past years. At this time in the cycle, candidates had raised 10% of their money in amounts of $200 or less in 2006, 11% in 2004, 12% in 2002 and 15% in 2000. There is little difference between Republicans and Democrats, but incumbents typically raise a lower percentage of their money from small donors than do challengers (10% versus 16% in the 2008 cycle so far).

Even so, challengers and open seat candidates typically bring in more money from self-financing than from small donors. In districts considered to be competitive by the leading political rating services, the best-funded challengers have on average raised the same share (14%) from small donors as from self-financing.

These numbers form a stunning portrait of the seductiveness of presidential politics for the progressive online community. For a movement whose members are constantly talking about change "from the bottom up," the pattern of donations could hardly be more old-school, top down.

I should make clear that I am personally familiar with the giant sucking sound of presidential politics (as Perot might put it), having been a staffer in John Kerry's 2004 run. I have seen the sausage factory up close, and that experience makes me even more leery of progressive strategies that start at the top.

SAM SMITH, SHADOWS OF HOPE, 1994 Much that is written [about national politics] stays comfortably within the two by three mile area in which one finds the White House and the Congress, the Supreme Court and the State Department, the Pentagon, the Watergate and the National Press Club. As typical pasture in the American west, this spread could support about 120 cows and their calves. The tendency to concentrate our view of politics and of our collective selves upon this tiny enclave has accelerated in recent decades in part because of a dramatic shift in power away from fifty "united states" towards an increasingly centralized and powerful federal government.

But it has also been encouraged by a conglomerated media that requires news topics as ubiquitous as its own expanding corporate structures yet which still can be distilled into a single face or story. Thus Congress has lost power relative to the White House not merely for various political reasons, but because 535 legislators are simply too many for the media to handle.

TV, in particular, treats politics much as it does wide screen movies; it snips off the right and left sides until the frame fits comfortably within the more equilateral shape of its eye. The edges of our experience are lost and we find ourselves staring at a comfortable center -- which in the case of politics, means we find ourselves endlessly watching the President while much of the rest of American democracy passes unnoticed.

This preoccupation with the presidency not only exaggerates the importance of the position, it distorts the constitutional division of political power, denigrates the significance of state and local government and creates pressures for presidential action when such action may be neither wise nor even lawful. We cannot, even out of seemingly harmless celebrity worship, imbue our president with supra-constitutional virtues or powers without simultaneously damaging the Constitution and the democratic system it was established to protect.

Besides, our presidential fetish badly skews our view of our country and the changes occurring within it -- not only elsewhere in government but beyond politics entirely. It trivializes our own collective and individual roles in creating social and political change. And, conversely, it can create the illusion of great change when far less is really happening.

Monday, December 24, 2007


[From a speech by the head of the National Trust for Historic Preservation at the National Building Museum]

Up to now, our approach to life on this planet has been based on the assumption that "there's plenty more where that came from." With our environment in crisis, we have to face the fact that there may not be "plenty more" of anything - except trouble. In the face of that realization, we're challenged to find a way of living that will ensure the longevity and health of our environmental, economic, and social resources. . .

The United States is a big part of the problem. We have only 5% of the world's population, but we're responsible for 22% of the world's greenhouse gas emissions that are the leading cause of climate change. Much of the debate on this subject usually focuses on the need to reduce auto emissions. But according to the EPA, transportation - cars, trucks, trains, airplanes - accounts for just 27% of America's greenhouse gas emissions, while 48% - almost twice as much - is produced by the construction and operation of buildings. . . In fact, more than 10% of the entire world's greenhouse gas emissions is produced by America's buildings - but the current debate on climate change does not come close to reflecting that huge fact. . .

The retention and reuse of older buildings is an effective tool for the responsible, sustainable stewardship of our environmental resources - including those that have already been expended. . .

According to a formula produced for the Advisory Council on Historic Preservation, about 80 billion BTUs of energy are embodied in a typical 50,000-square-foot commercial building. That's the equivalent of 640,000 gallons of gasoline. If you tear the building down, all of that embodied energy is wasted.

What's more, demolishing that same 50,000-square-foot commercial building would create nearly 4,000 tons of waste. That's enough debris to fill 26 railroad boxcars - that's a train nearly a quarter of a mile long, headed for a landfill that is already almost full.

Once the old building is gone, putting up a new one in its place takes more energy, of course, and it also uses more natural resources and releases new pollutants and greenhouse gases into our environment. Look at all the construction cranes dotting the Washington skyline, and consider this: It is estimated that constructing a new 50,000-square-foot commercial building releases about the same amount of carbon into the atmosphere as driving a car 2.8 million miles.

Since 70% of the energy consumed over a building's lifetime is used in the operation of the building, some people argue that all the energy used in demolishing an older building and replacing it is quickly recovered through the increased energy efficiency of the new building - but that's simply not true. Recent research indicates that even if 40% of the materials are recycled, it takes approximately 65 years for a green, energy-efficient new office building to recover the energy lost in demolishing an existing building. And let's face it: Most new buildings aren't designed to last anywhere near 65 years. Despite these surprising statistics and many more like them, we persist in thinking of our buildings as a disposable - rather than a renewable - resource.

A report from the Brookings Institution projects that by 2030 we will have demolished and replaced 82 billion square feet of our current building stock, or nearly 1/3 of our existing buildings, largely because the vast majority of them weren't designed and built to last any longer.

That much demolition will create a lot of debris. If we didn't recycle any of the building materials, we'd be left with 5.5 billion tons of waste. That's enough debris to fill almost 2,500 NFL stadiums.

How much energy will it take to demolish and replace those buildings? Enough to power the entire state of California - the 10th largest economy in the world - for 10 years. On the other hand, if we were to rehab just 10% of these buildings, we would save enough energy to power the state of New York for well over a year.

Instead of focusing on generalities, let's look at a specific building - like the one we're in right now.

It's estimated that the National Building Museum contains about 1.5 million bricks. When you consider how much energy it took to make all those bricks, plus how much it took to manufacture the other materials, then transport them to this site and put them all together in this marvelous structure, the total embodied energy in this building is the equivalent of nearly 2 million gallons of gasoline. If we assume the average vehicle gets about 21 miles to the gallon, that means there's enough embodied energy in this building to drive a car about 42 million miles.

All of that energy would be wasted if this building were to be demolished and landfilled. What's more, the demolition itself would require the equivalent of more than 8,700 gallons of gas - and it would create nearly 11,000 tons of waste.
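The fuel-equivalence arithmetic the speech relies on here is easy to verify. As a quick sanity check (a sketch using only the speech's own quoted figures, with 21 mpg as its assumed average vehicle fuel economy):

```python
# Sanity check of the embodied-energy figures quoted in the speech.
# Both inputs are the speech's own numbers, not independent data.

embodied_gallons = 2_000_000   # gasoline-equivalent energy embodied in the building
mpg = 21                       # assumed average vehicle fuel economy

# Gallons of fuel times miles per gallon gives the driving-distance equivalent.
miles_equivalent = embodied_gallons * mpg
print(f"{miles_equivalent:,} miles")  # 42,000,000 miles, matching the speech
```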

It all comes down to this simple fact: We can't build our way out of the global warming crisis. We have to conserve our way out. That means we have to make better, wiser use of what we've already built. . .

Most recent efforts by the green community place heavy emphasis on new technologies rather than on tried-and-true preservation practices that focus on reusing existing buildings to reduce the environmental impacts associated with demolition and new construction. The most popular green-building rating system, the Leadership in Energy and Environmental Design, or LEED program developed by the U. S. Green Building Council, was designed principally for new construction – underscoring the fact that words like "rehabilitation" and "reuse" haven't had much resonance in the green-building lexicon.

This emphasis on new construction is completely wrong-headed. The statistics I cited earlier tell us clearly that buildings are the problem - but incredibly, we propose to solve the problem by constructing more and more new buildings while ignoring the ones we already have.

No matter how much green technology is employed in its design and construction, any new building represents a new impact on the environment. The bottom line is that the greenest building is one that already exists.

It's often alleged that historic buildings are energy hogs - but in fact, some older buildings are as energy-efficient as many recently-built ones, including new green buildings. Data from the U.S. Energy Information Administration suggests that buildings constructed before 1920 are actually more energy-efficient than buildings built at any time afterwards - except for those built after 2000. Furthermore, in 1999, the General Services Administration examined its buildings inventory and found that utility costs for historic buildings were 27% less than for more modern buildings.

It's not hard to figure out why. Many historic buildings have thick, solid walls, resulting in greater thermal mass and reducing the amount of energy needed for heating and cooling. Buildings designed before the widespread use of electricity feature transoms, high ceilings, and large windows for natural light and ventilation, as well as shaded porches and other features to reduce solar gain. Architects and builders paid close attention to siting and landscaping as tools for maximizing sun exposure during the winter months and minimizing it during warmer months. . .

I'm not suggesting that all historic buildings are perfect models of efficient energy use – but, contrary to what many people believe, older buildings can "go green." The marketplace now offers a wide range of products that can help make older buildings even more energy-efficient without compromising the historic character that makes them unique and appealing. And there's a large and growing number of rehab/reuse projects that offer good models of sustainable design and construction.

More recent buildings - especially those constructed between the 1950s and 1980s - pose a greater challenge. Many of them were constructed at a time when fossil fuels were plentiful and inexpensive, so there was little regard for energy efficiency. In addition, they often include experimental materials and assemblies that were not designed to last beyond a generation.

Today, these buildings make up more than half of our nonresidential building stock. Because of their sheer numbers, demolishing and replacing them isn't a viable option. We must find ways to rehabilitate these buildings and lighten their environmental footprint while still protecting their architectural significance. . .

It makes no sense for us to recycle newsprint and bottles and aluminum cans while we're throwing away entire buildings, or even entire neighborhoods. This pattern of development is fiscally irresponsible, environmentally disastrous and ultimately unsustainable.

Tuesday, December 18, 2007


JIM SMITH, PORTSIDE - They stopped a war, ended racial segregation, set off an explosion of creativity in arts and music, and changed the world. The World War II generation? Think again. It was the much maligned generation of the 60s that did all this, and more.

While we respect the generation of our fathers and grandfathers, we cannot pretend that their achievements during WWII had the breadth or depth of the achievements of the 60s generation of their sons and daughters. Every nation invents myths about itself. Some of the biggest whoppers have to do with World War II. It's true that the generation called the greatest fought fascism and was on the winning side. Yet 80 percent of the war against Germany was fought on the eastern front by the Soviet Union. The Russians, beginning in 1941, fought, retreated, and ultimately overcame the greatest war machine in the world, the German army. The U.S. and the British fought on the European continent against the Germans for scarcely 11 months. The U.S. did bear the brunt in the Pacific against a much inferior foe, Japan. That engagement ended not in glory, but in the shame of using atomic weapons against a civilian population for the only time in history.

Of course the WWII generation should be praised for playing a role in the defeat of fascism, but here at home they left racial segregation and Jim Crow laws untouched, and allowed home-grown fascism in the guise of McCarthyism to grow into the biggest threat to our civil liberties of all time, the Bush regime notwithstanding.

Why is the 60s generation the greatest? Because it tore down a lot of walls that needed tearing down. The Freedom Riders - both Black and white - invaded Mississippi without the support of the U.S. Army or National Guard. Some were killed, many were beaten. Yet they were the vanguard of a movement that succeeded in changing laws, and the way people think. They exhibited just as much courage and heroism as did many WWII troops being ordered to advance on enemy positions.

The same thing happened in the fields and barrios of the Southwest. Tens of thousands joined Cesar Chavez's struggle for the rights of farmworkers. And in the cities, mass marches, strikes and demonstrations did for Mexican-Americans and Puerto Ricans what the civil rights movement did for Blacks.

Gay Liberation made the headlines on June 28, 1969 when gay and transgender people stood up to police harassment at the Stonewall Inn in New York.

The Women's movement flowed from millions of women entering the workforce in the 60s, and from women intellectuals taking on the male establishment.

The American Indian Movement was reminding the rest of us that they had not all been victims of genocide and were again capable of fighting for their land and traditions.

The student movement began at UC Berkeley in the early 60s with militant demonstrations against the House Un-American Activities Committee (HUAC), and went on to fight for free speech on campus. . .

The 60s generation made one mistake, and it was a whopper. We thought the millennium had arrived, that the Age of Aquarius was upon us, where peace would replace war and love would replace hate. We underestimated those who had a class interest in keeping millions working meaningless jobs to feed their burgeoning profits.

In large parts of the U.S., especially the mid-west and the south, the 60s cultural revolution had hardly penetrated. Here a love-it-or- leave-it silent majority remained that could easily be manipulated by conniving politicians and corporations. . . .

In the end, the 60s generation had stopped a war, made racism a dirty word, and showed us how to dream of peace, equality and a better world. We may not have set the world free, but our greatness lies in the fact that we tried.


DAVID U. HIMMELSTEIN & STEFFIE WOOLHANDLER IN NY TIMES - In 1971, President Nixon sought to forestall single-payer national health insurance by proposing an alternative. He wanted to combine a mandate, which would require that employers cover their workers, with a Medicaid-like program for poor families, which all Americans would be able to join by paying sliding-scale premiums based on their income.

Nixon's plan, though never passed, refuses to stay dead. Now Hillary Clinton, John Edwards and Barack Obama all propose Nixon-like reforms. Their plans resemble measures that were passed and then failed in several states over the past two decades.

In 1988, Massachusetts became the first state to pass a version of Nixon's employer mandate — and it added an individual mandate for students and the self-employed, much as Mrs. Clinton and Mr. Edwards (but not Mr. Obama) would do today. Michael Dukakis, then the state's governor, announced that "Massachusetts will be the first state in the country to enact universal health insurance." But the mandate was never fully put into effect. In 1988, 494,000 people were uninsured in Massachusetts. The number had increased to 657,000 by 2006.

Oregon, in 1989, combined an employer mandate with an expansion of Medicaid and the rationing of expensive care. When the federal government granted the waivers needed to carry out the program, Gov. Barbara Roberts said, "Today our dreams of providing effective and affordable health care to all Oregonians have come true." The number of uninsured Oregonians did not budge.

In 1992 and '93, similar bills passed in Minnesota, Tennessee and Vermont. Minnesota's plan called for universal coverage by July 1, 1997. Instead, by then the number of uninsured people in the state had increased by 88,000.

Tennessee's Democratic governor, Ned McWherter, declared that "Tennessee will cover at least 95 percent of its citizens." Yet the number of uninsured Tennesseans dipped for only two years before rising higher than ever.

Vermont's plan, passed under Gov. Howard Dean, called for universal health care by 1995. But the number of uninsured people in the state has grown modestly since then.

The State of Washington's 1993 law included the major planks of recent Nixon-like plans: an employer mandate, an individual mandate for the self-employed and expanded public coverage for the poor. Over the next six years, the number of uninsured people in the state rose about 35 percent, from 661,000 to 898,000.

As governor, Mitt Romney tweaked the Nixon formula in 2006 when he helped devise a second round of Massachusetts health care reform: employers in the state that do not offer health coverage face only paltry fines, but fines on uninsured individuals will escalate to about $2,000 in 2008. On signing the bill, Mr. Romney declared, "Every uninsured citizen in Massachusetts will soon have affordable health insurance." Yet even under threat of fines, only 7 percent of the 244,000 uninsured people in the state who are required to buy unsubsidized coverage had signed up by Dec. 1. Few can afford the sky-high premiums.

Each of these reform efforts promised cost savings, but none included real cost controls. As the cost of health care soared, legislators backed off from enforcing the mandates or from financing new coverage for the poor. Just last month, Massachusetts projected that its costs for subsidized coverage may run $147 million over budget.

The "mandate model" for reform rests on impeccable political logic: avoid challenging insurance firms' stranglehold on health care. But it is economic nonsense. The reliance on private insurers makes universal coverage unaffordable.

With the exception of Dennis Kucinich, the Democratic presidential hopefuls sidestep an inconvenient truth: only a single-payer system of national health care can save what we estimate is the $350 billion wasted annually on medical bureaucracy and redirect those funds to expanded coverage. Mrs. Clinton, Mr. Edwards and Mr. Obama tout cost savings through computerization and improved care management, but Congressional Budget Office studies have found no evidence for these claims.

Monday, December 10, 2007


MAYOR ROCKY ANDERSON, SALT LAKE CITY, OCT 27 - We raise our voices in unison to say to President Bush, to Vice President Cheney, to other members of the Bush Administration (past and present), to a majority of Congress, including Utah's entire congressional delegation, and to much of the mainstream media: "You have failed us miserably and we won't take it anymore."

While we had every reason to expect far more of you, you have been pompous, greedy, cruel, and incompetent as you have led this great nation to a moral, military, and national security abyss. You have breached trust with the American people in the most egregious ways. You have utterly failed in the performance of your jobs. You have undermined our Constitution, permitted the violation of the most fundamental treaty obligations, and betrayed the rule of law.

You have engaged in, or permitted, heinous human rights abuses of the sort never before countenanced in our nation's history as a matter of official policy. You have sent American men and women to kill and be killed on the basis of lies, on the basis of shifting justifications, without competent leadership, and without even a coherent plan for this monumental blunder. . .

You have acted in direct contravention of values that we, as Americans who love our country, hold dear. You have deceived us in the most cynical, outrageous ways. You have undermined, or allowed the undermining of, our constitutional system of checks and balances among the three presumed co-equal branches of government. You have helped lead our nation to the brink of fascism, of a dictatorship contemptuous of our nation's treaty obligations, federal statutory law, our Constitution, and the rule of law.

Because of you, and because of your jingoistic false 'patriotism,' our world is far more dangerous, our nation is far more despised, and the threat of terrorism is far greater than ever before. It has been absolutely astounding how you have committed the most horrendous acts, causing such needless tragedy in the lives of millions of people, yet you wear your so-called religion on your sleeves, asserting your God-is-on-my-side nonsense when what you have done flies in the face of any religious or humanitarian tradition. Your hypocrisy is mind-boggling - and disgraceful. What part of "Thou shalt not kill" do you not understand? What part of the "Golden rule" do you not understand? What part of "be honest," "be responsible," and "be accountable" don't you understand? What part of "Blessed are the peacekeepers" do you not understand?

Because of you, hundreds of thousands of people have been killed, many thousands of people have suffered horrendous lifetime injuries, and millions have been run off from their homes. For the sake of our nation, for the sake of our children, and for the sake of our brothers and sisters around the world, we are morally compelled to say, as loudly as we can, "We won't take it anymore!" As United States agents kidnap, disappear, and torture human beings around the world, you justify, you deceive, and you cover up. We find what you have done to men, women and children, and to the good name and reputation of the United States, so appalling, so unconscionable, and so outrageous as to compel us to call upon you to step aside and allow other men and women who are competent, true to our nation's values, and with high moral principles to stand in your places for the good of our nation, for the good of our children, and for the good of our world.

In the case of the President and Vice President, this means impeachment and removal from office, without any further delay from a complacent, complicit Congress, the Democratic majority of which cares more about political gain in 2008 than it does about the vindication of our Constitution, the rule of law, and democratic accountability. It means the election of people as President and Vice President who, unlike most of the presidential candidates from both major parties, have not aided and abetted in the perpetration of the illegal, tragic, devastating invasion and occupation of Iraq. And it means the election of people as President and Vice President who will commit to return our nation to the moral and strategic imperative of refraining from torturing human beings. In the case of the majority of Congress, it means electing people who are diligent enough to learn the facts, including reading available National Intelligence Estimates, before voting to go to war. It means electing to Congress men and women who will jealously guard Congress's sole prerogative to declare war. It means electing to Congress men and women who will not submit like vapid lap dogs to presidential requests for blank checks to engage in so-called preemptive wars, for legislation permitting warrant-less wiretapping of communications involving US citizens, and for dangerous, irresponsible, saber-rattling legislation like the recent Kyl-Lieberman amendment.

We must avoid the trap of focusing the blame solely upon President Bush and Vice-President Cheney. This is not just about a few people who have wronged our country - and the world. They were enabled by members of both parties in Congress, they were enabled by the pathetic mainstream news media, and, ultimately, they have been enabled by the American people, 40% of whom are so ill-informed they still think Iraq was behind the 9/11 attacks: a people who know and care more about baseball statistics and which drunken starlets are not wearing underwear than they know and care about the atrocities being committed every single day in our name by a government for which we need to take responsibility.

As loyal Americans, without regard to political partisanship, as veterans, as teachers, as religious leaders, as working men and women, as students, as professionals, as businesspeople, as public servants, as retirees, as people of all ages, races, ethnic origins, sexual orientations, and faiths, we are here to say to the Bush administration, to the majority of Congress, and to the mainstream media: "You have violated your solemn responsibilities. You have undermined our democracy, spat upon our Constitution, and engaged in outrageous, despicable acts. You have brought our nation to a point of immorality, inhumanity, and illegality of immense, tragic, unprecedented proportions."

But we will live up to our responsibilities as citizens, as brothers and sisters of those who have suffered as a result of the imperial bullying of the United States government, and as moral actors who must take a stand. And we will, and must, mean it when we say 'We won't take it anymore.' If we want principled, courageous elected officials, we need to be principled, courageous, and tenacious ourselves. History has demonstrated that our elected officials are not the leaders; the leadership has to come from us. If we don't insist, if we don't persist, then we are not living up to our responsibilities as citizens in a democracy and our responsibilities as moral human beings. If we remain silent, we signal to Congress and the Bush administration, to candidates running for office, and to the world that we support the status quo.

Silence is complicity. Only by standing up for what's right and never letting up can we say we are doing our part. Our government, on the basis of a campaign we now know was entirely fraudulent, attacked and militarily occupied a nation that posed no danger to the United States. Our government, acting in our name, has caused immense, unjustified death and destruction. It all started five years ago, yet where have we, the American people, been? At this point, we are responsible. We get together once in a while at demonstrations and complain about Bush and Cheney, about Congress, and about the pathetic news media. We point fingers and yell a lot. Then most people politely go away until another demonstration a few months later.

How many people can honestly say they have spent as much time learning about and opposing the outrages of the Bush administration as they have spent watching sports or mindless television programs during the past five years? Escapist, time-sapping sports and insipid entertainment have indeed become the opiate of the masses. Why is this country so sound-asleep? Why do we abide what is happening to our nation, to our Constitution, to the cause of peace and international law and order? Why are we not doing all in our power to put an end to this madness? We should be in the streets regularly and students should be raising hell on our campuses. We should be making it clear in every way possible that apologies or convoluted, disingenuous explanations just don't cut it when presidential candidates and so many others voted to authorize George Bush and his neo-con buddies to send American men and women to attack and occupy Iraq.

Let's awaken, and wake up the country by committing here and now to do all each of us can to take our nation back. Let them hear us across the country, as we ask others to join us: "We won't take it anymore!" I implore you: Draw a line. Figure out exactly where your own moral breaking point is. How much will you put up with before you say "No more" and mean it?

I have drawn my line as a matter of simple personal morality: I cannot, and will not, support any candidate who has voted to fund the atrocities in Iraq. I cannot, and will not, support any candidate who will not commit to remove all US troops, as soon as possible, from Iraq. I cannot, and will not, support any candidate who has supported legislation that takes us one step closer to attacking Iran. I cannot, and will not, support any candidate who has not fought to stop the kidnapping, disappearances, and torture being carried on in our name.

If we expect our nation's elected officials to take us seriously, let us send a powerful message they cannot misunderstand. Let them know we really do have our moral breaking point. Let them know we have drawn a bright line. Let them know they cannot take our support for granted; that, regardless of their party and regardless of other political considerations, they will not have our support if they cannot provide, and have not provided, principled leadership.

The people of this nation may have been far too quiet for five years, but let us pledge that we won't let it go on one more day; that we will do all we can to put an end to the illegalities, the moral degradation, and the disintegration of our nation's reputation in the world.

Let us be unified in drawing the line, in declaring that we do have a moral breaking point. Let us insist, together, in supporting our troops and in gratitude for the freedoms for which our veterans gave so much, that we bring our troops home from Iraq, that we return our government to a constitutional democracy, and that we commit to honoring the fundamental principles of human rights.

In defense of our country, in defense of our Constitution, in defense of our shared values as Americans and as moral human beings, we declare today that we will fight in every way possible to stop the insanity, stop the continued military occupation of Iraq, and stop the moral depravity reflected by the kidnapping, disappearing, and torture of people around the world.

Monday, September 10, 2007


JONATHAN KOZOL - This morning, I am entering the 67th day of a partial fast that I began early in the summer as my personal act of protest at the vicious damage being done to inner-city children by the federal education law No Child Left Behind, a racially punitive piece of legislation that Congress will either renew, abolish, or, as thousands of teachers pray, radically revise in the weeks immediately ahead.

The poisonous essence of this law lies in the mania of obsessive testing it has forced upon our nation's schools and, in the case of underfunded, overcrowded inner-city schools, the miserable drill-and-kill curriculum of robotic "teaching to the test" it has imposed on teachers, the best of whom are fleeing from these schools because they know that this debased curriculum would never have been tolerated in the good suburban schools that they, themselves, attended.

The justification for this law was the presumptuous and ignorant determination by the White House that our urban schools are, for the most part, staffed by mediocre drones who will suddenly become terrific teachers if we place a sword of terror just above their heads and threaten them with penalties if they do not pump their students' scores by using proto-military methods of instruction -- scripted texts and hand-held timers -- that will rescue them from doing any thinking of their own. There are some mediocre teachers in our schools (there are mediocre lawyers, mediocre senators, and mediocre presidents as well), but hopelessly dull and unimaginative teachers do not suddenly turn into classroom wizards under a regimen that transforms their classrooms into test-prep factories.

The real effect of No Child Left Behind is to drive away the tens of thousands of exciting and high-spirited, superbly educated teachers whom our urban districts struggle to attract into these schools. There are more remarkable young teachers like this coming into inner-city education than at any time I've seen in more than 40 years. The challenge isn't to recruit them; it's to keep them. But 50% of the glowing young idealists I have been recruiting from the nation's most respected colleges and universities are throwing up their hands and giving up their jobs within three years.

When I ask them why they've grown demoralized, they routinely tell me it's the feeling of continual anxiety, the sense of being in a kind of "state of siege," as well as the pressure to conform to teaching methods that drain every bit of joy out of the hours that their children spend with them in school.

"I didn't study all these years," a highly principled and effective first-grade teacher told me -- she had studied literature and anthropology in college while also having been immersed in education courses -- "in order to turn black babies into mindless little robots, denied the normal breadth of learning, all the arts and sciences, all the joy in reading literary classics, all the spontaneity and power to ask interesting questions, that kids are getting in the middle-class white systems." At a moment when black and Hispanic students are more segregated than at any time since 1968 (in the typical inner-city school I visit, out of an enrollment that may range from 800 to 4,000 students, there are seldom more than five or six white children), NCLB adds yet another factor of division between children of minorities and those in the mainstream of society. In good suburban classrooms, children master the essential skills not from terror but from exhilaration, inspired in them by their teachers, in the act of learning in itself. They're also given critical capacities that they will need if they're to succeed in college and to function as discerning citizens who have the power to interrogate reality. They learn to ask the questions that will shape the nation's future, while inner-city kids are being trained to give pre-scripted answers and to acquiesce in their subordinate position in society.

In the wake of the calamitous Supreme Court ruling in the end of June that prohibited not only state-enforced but even voluntary programs of school integration, No Child Left Behind -- unless it is dramatically transformed -- will drive an even deeper wedge between two utterly divided sectors of American society.

This, then, is the reason I've been fasting, taking only small amounts of mostly liquid foods each day, and, when I have stomach pains, other forms of nourishment at times, a stipulation that my doctor has insisted on in order to avert the risk of doing long-term damage to my heart. Twenty-nine pounds lighter than I was when I began, I've been dreaming about big delicious dinners.

Still, I feel an obligation to those many teachers who have told me, not as an accusation but respectfully, that it was one of my books that diverted them from easier, more lucrative careers and brought them into teaching in the first place. Some call me in the evenings, on the verge of tears, to tell me of the maddening frustration that they feel at being forced to teach in ways that make them hate themselves.

I don't want them to quit their jobs. I give them whatever good survival strategies I can. I tell them that the best defense is to be extremely good at what they do: Deliver the skills! Don't let your classroom grow chaotic! A teacher who can keep a reasonable sense of calm within her room, particularly in a school in which disorder has been common, renders herself almost indispensable. . .

I've tried very hard to convince a number of the more enlightened Democrats who serve on the Senate education panel to introduce amendments that will drastically reduce our government's reliance upon standardized exams in judgment of a child, school, or teacher, and attribute greater weight to factors that are not so simple-mindedly reducible to numbers.

Sophisticated as opposed to low-grade methods of assessment would not only tell us whether little Oscar or Shaniqua started out their essays with "a topic sentence" but would also tell us whether they wrote something with the slightest hint of authenticity and charm or simply stamped out insincere placebos. (A child gets no credit for originality or authenticity under No Child Left Behind. Sincerity gets no rewards. Endearing stylistic eccentricity, needless to say, is not rewarded either. That which can't be measured is not valued by the technocrats of uniformity who have designed this miserable piece of legislation.) . . .

Saturday, August 11, 2007



The commons of my own childhood was abundant in its own way. We lived in an old, close-in Boston suburb, with meandering streets that followed the contours of the land. We played in the old aqueduct, and at the swamp that formed a wooded basin behind the houses. By unspoken agreement, the entire neighborhood was open to us kids. We played football in one yard, wiffle ball in another. We didn't know the owners, and no one seemed to care. . .

Kids now are caught in the temporal rhythms of their parents' business lives: classes, schedules, day-calendars even. When they aren't marching to the clock they are locked into commoditized entertainments. A woman from England observed to me not long ago, "The children here - they don't know how to play."

The inability to play is a symptom of commons deprivation, which is the genus of which nature deprivation is the species. I have a suspicion, which I cannot prove, that much depression and so-called 'attention deficit disorder' is at least partly a result of hyper-enclosed and commoditized lives. . . The defect always must be in the child, never in the commercial culture in which the child is immersed.

We internalize this state of enclosure so much that we are surprised when we encounter something different. That was my response to the splendid children's play area at the Chek Lap Kok Airport in Hong Kong. In the U.S. you sometimes find a perfunctory play structure stationed in a cramped corner by someone who obviously didn't much care. In Hong Kong, by contrast, the space is expansive and inviting, with a foam floor (remove shoes!) and play structures that kids run for.

Someone there actually realized that children about to embark on a 16 hour journey in a cramped plane might need to let off a little steam. Amazing. Even more amazing for an American, it's free.

We had a similar experience in Boston recently. Boston is, on the whole, a hospitable city for kids. There's the public garden and the swan boats, just as they are in Make Way For Ducklings. There's Old Ironsides, the historic frigate; the Science Museum, Paul Revere's house, a host of other things. On this trip we discovered something else: public fountains, where kids can splash and wade on hot summer days.

We encountered several: one in the Boston Common, another at Copley Square, and a third at the Christian Science Center. We were there during a heat wave. I was working, but my wife and son were trekking around to the sights. In the late afternoon, they would stop by the fountain at Copley, where Josh would take off his shirt and socks (he's four) and jump in with the other kids. It was a happy festive scene. People dropped their urban guard and became accessible if not instant friends.

The laughter of children is not a common sound in American downtowns. Why exactly is that? What is a more natural use of urban space in the hot summer than a fountain in which kids can play? I couldn't help reflecting once again on the strange psychological disorder that sees evil in such scenes because people are not paying money to a private owner for an allotted use. Thankfully someone in Boston is resistant to that inner dysfunction that is so prevalent in America today.

I have no idea whether these open fountains are official policy or whether something slipped through the cracks. Perhaps it is best not to ask. I like to imagine that somewhere on the Boston streets there is a police officer like Michael in Make Way For Ducklings, a hulking Irishman with a big heart (excuse me, hahhhht), who saw the laughter and, in the best tradition of that city, decided that the enforcement of the laws in question could wait for another day.

Saturday, May 26, 2007



Most of my politics have revolved around a single question which many of us regard as key to what we hope will be our country's eventual return to sanity: What conditions are required for a third party on and of the left to successfully compete for power within the American electoral system? What follows is a description of the trajectory of my involvement.

I became convinced of the necessity of a progressive third party with Barry Commoner's presidential campaign of 1980. The first opportunity I had to participate, insofar as it was possible, occurred within Ralph Nader's more or less unofficial candidacy of 1996. When Nader declared his intention to make a serious run in 2000 I would become deeply involved, an experience I document here.

The hours I put into Nader's campaign were not invested in the expectation of his winning or even coming close. Nor, unlike some third party advocates, was my intention to demonstrate the capacity of the left to undermine the coalition required for the Democrats to win national elections. I was aware that the spoiler issue would allow apologists to paint any Democrat, no matter how corrupt and reactionary, as preferable to a serious third party challenger. "Those are the guys who brought you George Bush" became a mantra intoned by party hacks in the years since, and an effective one: the subsequent decline of the Green Party, most notably in the Democratic machine towns and cities where they had established beachheads, testifies to the success of this cynical canard, one for which neither the Greens, nor those urging the Greens to embrace the spoiler role, have yet developed a convincing rebuttal.

The promise of Nader's Green Party campaign consisted almost entirely in its potential for establishing a foundation for local and state level third party candidacies.

These, according to the plan, would take advantage of the organization set up in the wake of the 2000 election and continue well into the future. At best, some would follow the trajectory of Bernie Sanders, elected to the US Senate last year as an Independent Socialist. The routine downplaying of Sanders' third party affiliation is unfortunate since Sanders' success should lead us to ask the question above in a slightly different form: why do we not have several other Independent Socialists or Greens in the Senate, more in Congress and still more in state legislatures and city councils? While there are many reasons why there are not, I would argue that there are no good reasons.

The primary obstacle is not, as many have argued, statutory.

Yes, a thicket of exclusionary laws passed by the major parties places an undue burden on third parties - but as Sanders and many others have demonstrated, these can be overcome.

Furthermore, as I would discover in my involvement with the minutiae of election law, the statutes are not uniformly stacked against third parties. For example, cities in Connecticut and around the country have minority party set-asides which can be and have been used to a third party's advantage. Some states are beginning to offer public financing for candidates meeting a fairly minimal threshold of support. Even without these in effect, financial barriers are not insurmountable - though compensating for the inevitable handicap will require the careful organization of a substantial grassroots volunteer base.

If I'm right about this, a larger point follows, namely that the failure of third parties is not written in the stars but is in ourselves - in our failure to take advantage of clear opportunities that are available to us to achieve real political power. The set of attitudes preventing us from doing so is the focus of an essay which takes as its point of departure the slogan "If the people lead, the leaders will follow." Given that our political leaders respond to protest by running in the opposite direction of where we are marching, it is time not only to put the slogan out to pasture, but, more importantly, the assumptions which confer on it its status as a shibboleth. A follow-up piece attempts to make concrete suggestions for the kind of organization a functional left should be developing to move beyond the self-imposed limits of protest politics to participation in power. While I am not alone in making such proposals, that nothing of the sort which I describe seems to have materialized in the interim means that a critical mass of consensus on this point has not been reached among the left, such as it is. Nor are there any signs that it is emerging.

In any case, one does not advance a critique of politics obsessed with rhetoric on the rhetorical level. Issuing one more scholarly monograph, blog entry, or seminar to a room full of like-minded activists . . . . The most effective, arguably the only effective, statement is to show by example that the foundations for real political power can be established, that power can be achieved and exercised in advancing a left agenda on whatever level this is possible.

There was a brief period when the Nader campaign nationally and the local Green Party chapters which supported it seemed to be breaking through with this message. Among the most receptive areas were the academic ghettos in New Haven, where I was living at the time, and where Nader came close to beating Gore and obliterated Bush.

It was the low level, nuts and bolts and frequently menial involvement in the Nader campaign which woke some of us up to the potential opening the electoral system provides, even in its presently existing corrupt and anti-democratic form. And a few of us would go further, moving beyond the rhetorical politics with which we are most comfortable into what can be reasonably called real politics. In my case this took the form of serving as a Green Party member of the New Haven Board of Aldermen, winning a special election in July of 2001 and the regular election in November of that year. An account of my experiences in that capacity can be found here. A discussion of the type of campaign strategies which were successful for us can be found here. Some of my other writings from this period, including my semi-regular "Halle Sez" columns, are on this site here.

Rather than constituting my first steps up the political ladder, my one and a half terms on the Board of Aldermen marked the highest rungs I would achieve, followed by a precipitous descent for me and the New Haven Greens.

The first reason for my withdrawal is somewhat personal and I share it here because something of the kind is likely to be encountered by other groups having success at electing insurgent candidates to office. I was able to win partly because I had achieved the sort of conventional respectability as a homeowner and taxpayer which made my candidacy palatable to voters who might not otherwise be sympathetic to my politics. Paradoxically, this same profile made it more difficult for me to make the sacrifices that were required to continue functioning in politics. It would have meant jettisoning a dream job teaching music to undergraduates at Yale in exchange for a political future which may very well have been no future at all. The prospect of long-term marginal employment and non-employment as a gadfly activist was decidedly unattractive if not a bit frightening.

Of course, it came as no surprise when the Democratic machine devoted a high fraction of its campaign resources to an unsuccessful effort to unseat me, even to the extent of leaving a vulnerable Republican in another ward unopposed. No more surprising was the active participation of nominal Yale liberals who were then beginning their expected upward trajectory through Democratic Party ranks. Yale liberals in the service of machine politics, frequently in its most thuggish form, have been a prominent feature of the New Haven landscape for generations - an arrangement which nurtured the political careers of, among others, Bill Clinton and Joseph Lieberman.

Exposing this marriage of convenience was a factor in my decision to enter politics, so by no means did it constitute a factor in my decision to withdraw from it. Having said that, the blatant albeit well-concealed opportunism of Yale liberals was stunning even to a cynic such as myself. And this awareness made me increasingly unsure of the coalition which served as a foundation for the local party. Reinforcing this insecurity was an incident in the spring following my re-election.

By then war on Iraq was imminent and the local party took the lead in passing one of the country's first resolutions opposing military action as well as sponsoring local demonstrations. One of these would feature a Palestinian Yale Law School student whose appearance provoked accusations of anti-Semitism from supporters of Israel, some of whom had been active in the local party. An article in the "alternative" weekly headlined "When Green Turned to Hate" provided a megaphone for the critics. The coverage, replete with the half-truths and character assassination now familiar from attacks on Jimmy Carter, was patently dishonest, a bit shocking and, unfortunately, likely consequential in undermining some of the support for the party.

While this was upsetting at the time, in retrospect this was only a relatively dramatic instance of the sorts of fissures which any party attempting to forge a left coalition will have to deal with. No party can be all things to all people nor should it be.

The best it can do is to take decent and principled stands, one of which is a no-brainer: opposition to military aid to governments engaging in widespread and systematic human rights abuses. Such stands will inevitably antagonize those having emotional attachments to Turkey, Saudi Arabia, Indonesia, or the United States, as well as those who continue to view Israel, against all evidence, as "a light unto nations". The hope, and I believe the reality, is that in the long run a foundation constructed on integrity and honesty will stand in increasingly stark contrast to the major parties, whose politics are based on the windfalls of corporate cash to which manipulation, cynicism and dishonesty provide access. At minimum, political integrity means standing firm against individuals in the party for whom a pet cause, whether it is a zoning variance for a parking lot or the rearming of helicopter warships to attack Gaza, constitutes a litmus test for their support, and being willing to accept the cost of doing so.

The assumption that principled politics are ultimately winning politics might have been vindicated given more time and under more favorable circumstances. Unfortunately, and this amounts to my third reason for stepping down, the Greens' brief rise occurred at the most unfavorable historical moment imaginable in that it overlapped almost exactly with the climate of repression ushered in by 9/11 and the assaults on any and all expressions of political non-conformity. While it might be expected otherwise, the climate in New Haven was no more welcoming or tolerant of dissent than anyplace else. A good indication was the response to an upside-down American flag hung outside of a Yale dorm window, which would precipitate a forced entry by two-by-four-wielding student athletes accompanied by the following message: "I love kicking the Muslims ass bitches ass! They should all die with Mohammad. We as Americans should destroy them and launch so many missiles their mothers don't produce healthy offspring. F*ck Iraqi Saddam following f*cks." A letter appearing in the Yale Daily News was representative of the reaction of much of the Yale community: Both sides should apologize, those perpetrating a criminal act as well as the student guilty of making the anti-war statement.

It came as no surprise that in the months following, anti-war protest at Yale was conspicuous by its absence. The only regular table on Beinecke Plaza was not that of organizers arranging attendance at large peace demonstrations in New York and Washington but that of Students Against Nuclear Iran, organized by supporters of Israel to advocate for an expansion of the U.S. military in the Mideast. Obviously, such a climate in the most "liberal" bastion of a presumably liberal city did not bode well for a party committed to non-violent resolutions to international conflicts.

In short, the internal divisions in the Greens made us vulnerable, as did the organizational failures mentioned above for which I share some responsibility. But even had all of us devoted more of our energies and even had we been informed with divine foresight in our political decisions, it seems more or less inevitable that the right wing tidal wave unleashed by 9/11 would have forced us into a holding pattern at best, and probably a decline more or less on the scale of that which would occur. An excellent candidate attempting to assume my seat on the board would be defeated by a standard issue Yale technocrat. The other Green elected in 2001, a born again Christian and former Bush supporter, whose commitment to left politics was never more than dubious, would leave the party in 2003 as would another equally opportunistic local politician in Hartford. A good hearted but embarrassingly inarticulate Green has become a perennial candidate in elections at all levels. This included the 2006 Senatorial race in which his performance in nationally covered debates reinforced the stereotype that the Greens are "not ready for prime time."

Where the Greens have ended up is obviously disappointing to those of us who had devoted time, money and energy to the party. That said, our disappointment is mitigated by the fact that we recognized going in that our investment was a gamble. The potential payoff was that which I alluded to above: the beginnings of a move towards a real oppositional party of a sort that most of the developed world takes for granted and which is a necessary condition for a real shift in the political center of gravity to take place here. We ended up losing our wager: many hours writing press releases, creating databases, studying voter lists, getting to know our neighbors and neighborhood; none of this ended up creating a foundation for independent, non-corporate candidates to move up the political ladder.

But in itemizing our losses, it becomes apparent that these are really not losses at all.

For what all this amounts to is not wasted energy but experience in the form of a real-world and intimate understanding of how political power works and what is required to achieve it. In particular we understand the precise nature of the rotten core which lies at the heart of urban machines. And we demonstrated how a small number of political novices are able to successfully challenge it and begin to build a foundation for a broader insurgency.

The knowledge we acquired does not seem, at the present moment, to be particularly relevant or useful. Since the 2004 election, a broad consensus has emerged that the only hope for effecting a transformation of the political system involves reforming the Democratic Party from the inside. While I don't expect that the reformist project will succeed, I am in no way hostile to it and I have a great deal of respect for some of those who are advancing it. In particular, I find David Sirota's writings and his organizing to be consistently well informed, savvy and credible. Sirota, and like minded organizations such as the Progressive Democrats of America are undoubtedly aware of the wholesale corruption of urban Democratic machines, but their focus is on national and state level races and on two objectives in particular: driving the Republicans further into the ditch while attempting to wrest control of the party from DLC hacks such as Rahm Emanuel and Steny Hoyer who have operated it for a generation.

The results of the 2006 elections provided some grounds for optimism on achieving the first objective and it seems likely that 2008 will bring further gains. The second objective is a much tougher nut to crack. There is no evidence that the grassroots base of the party - the Wellstonian Democratic wing of the Democratic party - has had its hand strengthened in any significant respect. . .

In perhaps the most significant indication of the near total grip of the corporate wing, not only does every serious presidential candidate pay homage either explicitly or implicitly to the corporate-militarist wing, but unlike in previous years there is not even any discussion of a candidate who could credibly be claimed to represent the party's left. A year before the first primaries, the Kucinich candidacy has already become a joke. Judging from the polls, where he doesn't even register, I may be one of the few who even knows who Mike Gravel is and what he stands for - or at least used to. That Wesley Clark, a known war criminal, has assumed the role of a marginal peace candidate is among the more bizarre ironies of the through-the-looking-glass political system which we have made for ourselves.

As for the front runners, comment is no longer necessary on Hillary Clinton following her having received the endorsement of Rupert Murdoch. In contrast, her main competitor Obama makes for a particularly impressive candidate - virtually a Platonic ideal - for those concerned with the continuing dominance of the nation's political system by those who own it. His recent comment that he would "not play chicken" with American lives by "cutting off funding to the troops" was surely received warmly by military contractors, who need not be concerned about reductions in cold war level defense expenditures to address domestic needs. His promotion of ethanol under the guise of an environmentally responsible energy policy was another indication of his ability to provide a veneer of activist legitimacy to what is just one more scam funneling federal subsidies to agribusiness. His abject genuflection before AIPAC, immediately on the heels of throwing an old Palestinian friend under a bus, signaled with near certainty that death, torture and misery will continue unabated on the West Bank. Staking out these positions surely played a role in his reaching his $25 million fundraising target, but it came at the expense of any claim to representing minimally decent politics.

While I agree with Sirota that the current dark horse candidate Edwards may have the most credible purchase on populist support, this assessment by Doug Ireland effectively removes the halo which Edwards has managed to acquire. Edwards has recognized that the only hope for his campaign lies in taking advantage of activist energies to compensate for what is likely to be an inability to compete on the financial playing field. This realpolitik recognition should in no way be seen as a repudiation of the more or less classic neoliberal profile staked out during Edwards' tenure in the Senate. That said, it is not impossible that I will register as a Democrat to vote for Edwards in the New York primary if it seems likely that this will make a difference.

Given their support among liberals - some pragmatic, some star struck and others merely opportunistic - it may take some years into an Obama or Edwards administration for an awareness to sink in among the Wellstonians that they have been played for suckers yet again. But if American troops are still dying in Iraq in 2010, if no serious attempt is made to deal with global warming, if wealth and income disparities continue their steady progression to still more grotesque levels, and if bloated military budgets prevent addressing a dangerously decaying domestic infrastructure, those who are willing to face the facts will recognize that the reformist strategy had the opposite effect of what was intended. Rather than strengthening the influence of the rank and file core of the Democratic Party, its having yet again pledged its unconditional fealty to the party will have resulted in a further erosion of its already insignificant influence.

When this occurs, we will be back to where we were in the waning years of the Clinton presidency. An awareness will return that the only hope for the future lies in the development of organized politics which is explicitly directed to challenging a two party system, one whose very essence is defined by corporate domination, unchecked militarism and environmental pillage. At this point, the work which the Greens did and for that matter the important ongoing work which some Greens continue to do, will have many lessons for how activists need to direct their energies. If and when this occurs, I and others will again have something to contribute. At the moment, as should be apparent from the rest of this website, I'm perfectly happy remaining on the sidelines while tending to, figuratively speaking though somewhat literally, my own garden here in the Hudson Valley.

Monday, May 21, 2007


The current issue of the Utne Reader (May - June 07) carried a short but sensibly provocative article protesting the stagnation and the cul-de-sac nature of street protests that involve nonviolent civil disobedience. Joseph Hart, the author, asks why the current antiwar movement is so impotent, despite "a staggering 67 percent disapproval of President Bush's handling of the war - a level that matches public sentiment at the tail end of the Vietnam War, when street protests, rallies, and student strikes were daily occurrences."

He believes, quoting Jack DuVall, president of the International Center on Nonviolent Conflict, that it is because "a street demonstration is only one form of protest and protest is only one tactic that can be used in a campaign. If it's not a part of a dedicated strategy to change policy, or to change power, protest is only a form of political exhibitionism."

Both gentlemen are being incomplete. Even without a military draft in place to arouse a larger public, the protestors against the Iraq war have affected the 2006 elections, performed sit-ins in Congressional offices, filed lawsuits against Bush's violations of people's civil liberties, brought Iraqi spokespeople to meet with influential Americans, worked with Iraq veterans against the war as well as with numerous former high ranking military, diplomatic and intelligence officials now retired from service in both Republican and Democratic Administrations who openly opposed the invasion at the outset.

Clearly all this has not been enough to move the Democrats to decisive action.

The obstinate, messianic militarist in the White House remains unmoved. With his ignorance of history itself becoming historic, this latter-day, obsessive-compulsive King George thinks he's a 21st century Winston Churchill.

Through the wide arc of his persistent lawlessness, Mr. Bush has done the country much damage here and abroad. But he has also demonstrated how variously the rule of law can be swept aside with impunity. He is both outside and above the Constitution, federal statutes, international treaties to which the U.S. is solemn signatory, and the restraints of the Congress and the federal courts.

A major restructuring of our laws to address the outlaw Presidency under Mr. Bush, or any like-minded successors, now has a solid empirical basis from which to move forward. Presidential outlawry did not start with Mr. Bush. It has been building up for a long time, moving from episodic to institutionalized forms.

For example, it is now routine for the courts to opt out of giving any citizen, group or member of Congress legal standing on matters of foreign and military policy even to plead their cases against the President. Here the courtroom door is closed.

For Mr. Bush, what would be repeated criminal negligence by anyone else has brought immunity from lawsuits by families of soldiers - and there were hundreds of them - who died because they were not provided with body and Humvee armor over three years or more in Iraq. Immunity even from equitable lawsuits seeking a mandamus for obligated action ignored by the President.

The Bush officials had the funds with which to procure these shields but somehow the Halliburtons got more of their urgent attention.

Clearly, the diverse opposition to Bush's war needs to move to higher levels. More meticulous lobbying in Congressional Districts, more pressure to initiate impeachment hearings, more exposure of what the Iraqi people, suffering so terribly, want, and much more organized focus by the retired, established military and civilian officials whose previous courage and experience give them great credibility today.

The number of active duty soldiers petitioning their member of Congress to end the war now exceeds twelve hundred. Since 72% of the soldiers in Iraq wanted the U.S. out within six to twelve months in a Zogby poll released very early in 2006, there is more potential from this source of actual military theatre experience.

The timid, anti-war members of Congress require more than all this opposition. Apparently they are looking for intensity, for more people having the war on their minds, demanding that the huge monies for this overseas destruction be turned into providing necessities for their communities.

These lawmakers seem to need to be buttonholed whenever they return to their Districts. In Washington, they keep saying things like, "Yeah, I know the polls but Americans are more interested in American Idol and their iPods."

So, Americans, start the buttonhole movement - at their Congress members' town meetings, at the clambakes they attend this summer, at the local parades where they strut, over at their local office (see the yellow pages listing under U.S. Government for the addresses and phone numbers) and through letters and telephone calls. You count when you make them count you.

Wednesday, May 9, 2007



Joan Hoff is author of Faustian Foreign Policy from Woodrow Wilson to George W. Bush (forthcoming, 2007, Cambridge University Press)

There comes a time, especially during foreign policy crises, when "retreads" are not what is needed if a president wants a "new way forward." A New World Order for the twenty-first century cannot be created by elderly Cold Warriors whose minds contain more Cold War baggage than fresh ideas, whether they consider themselves neo-cons or neo-liberals, idealists or realists. Younger diplomatic specialists exist whose careers and intellect have not been tainted by the questionable methods often used by the United States to win the Cold War.

Yet the prevalence of this "retread" phenomenon could clearly be seen in the membership of the bipartisan Iraq Study Group and the unseemly swiftness of the Senate's approval of Robert Gates to replace Donald Rumsfeld as Secretary of Defense. Not only was the average age of the commission 67, but fewer than half of its 10 members had been (or were) foreign policy, let alone Middle Eastern, experts, and all were Cold Warrior "retreads."

The ISG report seemed designed to rescue Bush, courtesy of his father's "realist" friends and advisers, from the idealist views of the president's neo-con advisers with suggestions for a moderate "change of course." It also aimed at taking the president's failed Iraqi policy off the political agenda in time for the 2008 election. This is exactly what Secretary of Defense Melvin Laird recommended to President Richard Nixon in March 1969. In essence, the ISG report basically called for the "Iraqization" or "Iraqafication" of the war, similar to the "Vietnamization" of that war.

In both cases the object was (is) to co-opt domestic dissatisfaction with the conduct of war by reducing American casualties and turning more of the fighting over to American-trained native soldiers. Nixon initiated this policy in 1969 but the war continued for another three years with the loss of more than 20,000 U.S. military personnel. At the time antiwar groups correctly predicted that "Vietnamization" would prolong rather than end the war. The same will likely prove true of "Iraqization."

Gates, a career intelligence officer and "retread" par excellence, has served every president since Richard Nixon and was a member of the Baker-Hamilton Commission. His best-known and most controversial service occurred when he was deputy director of the CIA under Bill Casey. In that capacity he recommended air strikes against Nicaragua, then led by Sandinista leader Daniel Ortega, and was later implicated in the Iran-Contra affair. Among other things, he wrote most of Casey's misleading testimony to Congress about these illegal arms sales to Iran via Israel, with the profits going to the Contras. Later, Iran-Contra Independent Counsel Lawrence Walsh did not indict Gates, but concluded that his statements about his involvement in the affair "often seemed scripted and less than candid."

After Gates became acting director of the CIA when Bill Casey became ill in December 1986, Reagan nominated him to head the agency, but concerns by Senators prompted Reagan to withdraw his nomination. In 1991 President George H. W. Bush nominated him again and, although thirty-one Democrats voted against him, Gates used his friendship with David Boren, the Democratic chair of Senate Intelligence Committee, to become CIA director until Bill Clinton came to office. In less than a full day of hearings the Senate approved Gates to replace Rumsfeld in December 2006.

What had happened to the doubts about his ethics and the spinning of intelligence in the 1980s? Have the 12 Democratic Senators still in Congress, of the thirty-one who voted against Gates in 1991, forgotten Senator Tom Daschle's (D-SD) words back then: "We can't afford to take the chance that a fellow who has deliberately trimmed intelligence and taken liberties with the truth will reform."

There is no evidence that Gates has ever stood up to his superiors or fostered innovative ideas in his previous positions. His role most likely will be to mollify complaints within the Pentagon and calm criticism from retired military officers. Additionally, he faces Cheney's power as proxy president. In that "retread" battle, odds are good that the vice president will prevail. Not surprisingly, Gates has endorsed the military surge to secure Baghdad as a "new way forward."

Then consider John D. Negroponte, who was charged with human rights violations as U.S. Ambassador to Honduras for running political cover for CIA-sponsored Honduran death squads fighting with the Nicaraguan Contras against the Sandinistas during and after the Iran-Contra affair. In Bush's second term Negroponte became the first Director of National Intelligence (DNI), charged with coordinating the fight against terrorism, and now has become the number two man at the State Department. Like Gates, Negroponte has no reputation for challenging the failed, unethical, or unconstitutional policies of his bosses. And both gave misleading testimony to Congress during the height of the Cold War.

Negroponte's replacement, retired Vice Admiral John McConnell, former director of the National Security Agency, went on to direct defense programs at Booz Allen, one of the nation's biggest private defense and intelligence contractors and the firm responsible for coming up with the infamous Total Information Awareness data-mining scheme. McConnell is considered a good, low profile technocrat, but not an independent or creative thinker - let alone someone with the management skills needed to make the intelligence community work, because he believes in outsourcing U.S. intelligence operations to private contractors.

Another "retread" with a less than savory reputation is Elliott Abrams. Convicted for giving false testimony and illicit fund-raising activity connected to Iran-Contra but pardoned by George H. W. Bush, Abrams went from Special Assistant to the President and Senior Director on the NSC for Near East and North African Affairs to Deputy National Security Adviser for Global Democracy Strategy.

Finally, consider the appointments of Fred Fielding, who served under Nixon and Reagan, as White House counsel primarily to ward off investigations - a role he is duplicating in opposing charges against Attorney General Alberto Gonzales. Then there is Zalmay Khalilzad, a neo-conservative protege of Cheney and member of the Project for a New American Century who served under both Reagan and Bush Sr. despite (or because of) his Unocal oil and Taliban connections in Afghanistan; he first became ambassador to the new Afghan regime under Hamid Karzai, then ambassador to Iraq after the U.S. invasion and, in 2007, ambassador to the UN. These personnel changes simply surround Bush with more "retreads" whose Cold War track records are suspect and who are not known for being innovative thinkers except when they are misleading Congress.

These "retreads" and others who surround Bush will not come up with new ideas, just new ways to mislead Congress and the American people about a failed policy in Iraq that may destabilize the entire Middle East.

Tuesday, May 8, 2007



In Eugene, Oregon, Jan Spencer is a well known advocate for culture change. He has transformed his quarter-acre suburban property into a permaculture Shangri-la, attracting many visitors. He collaborates with others on projects for culture change. Jan leads bike tours of permaculture sites in Eugene, speaks at public events and writes articles for publication. The following is from Jan's keynote speech at the Lane County Relocalization Conference that took place April 27-28 at two churches:


A localized economy picks up where market based global capitalism ends. No need to wait! One important principle of a localized economy is that of downsizing many of our resource intensive habits. This is a key leverage point. Reducing what we need and use brings us closer to taking care of those needs from local sources with great benefits to the local economy, the environment and culture change. It also deprives the global economy of revenue.

Also important to keep in mind is that an economy is not only about money. It is about taking care of needs. Many needs can be taken care of without money in creative ways that we are not so familiar with, such as barter, volunteering, local currencies, cooperatives, work trades, avoiding poor investments -- to name a few.

A few easy examples:

Home economics. The home can be an appreciable source of food, energy, water and taking care of one another. Home passive solar design, once built, is essentially free heat. A home with a garden and a bike rider means appreciable transportation energy can come from the back yard. Multigenerational living means young and old can help look after each other. Such a home provides many of its own needs outside the money economy.

Reducing our dependence on automobiles -- a huge key leverage point which can reduce foreign policy misadventures and automobile infrastructure costing hundreds of billions of dollars every year. That money, engineering skill, material resources and youthful vitality wasted on car culture can be redirected with great benefit elsewhere, such as environmental restoration, public transit, energy conservation programs, and urban redesign to reduce auto-dependency. The latter is particularly vital for local economies. Ending automobile dependence will prevent hundreds of billions in poor investments such as in car-induced public health costs.


Of all the essentials for survival, food is the furthest along in terms of localization and can be seen as a model for taking care of other needs. At present, there are numerous organizations and advocates calling for localizing and supporting local and regional food production.

One local food group has determined that Lane County can essentially feed itself if current non-food crops are replaced with edible crops and diets change to include substantially less meat. A network of local churches, through That’s My Farmer, actively supports local farms in a highly innovative way. Other organizations can make use of this model.

Local food is a key leverage point. It helps keep money local. It also avoids transportation costs and carbon emissions of food from elsewhere and sidesteps many of the uncertainties relating to oil and climate change.

In a localized food system, we can expect more agricultural work to be done by humans. Imagine people from town spending time on a farm at important times of the season. A new kind of farm - community-participated agriculture.

Relocalizing food will mean crop transitions and developing local markets. Crops for local use can provide energy, medicine and fiber as well.

Agriculture happens in town as well. Since experiencing a drastic reduction in oil imports, Cuba has been forced to relocalize in all areas of economy and way of life. They have gone mostly organic with food production. Remarkable amounts of food are produced within Cuban cities and towns, and the nation has developed the research and education infrastructure to help teach people how to grow food.

Exchanging grass for garden: Food Not Lawns contains enormous potential. Instead of a lawn mowing clientele, I foresee enterprising people making arrangements with property owners to convert all or parts of their properties into food production and share the produce with the property owner. Any surplus would go to a neighborhood market. If you are interested, ask me more, and I would be glad to elaborate on this idea.

People with limited space and mobility can grow food effectively in containers.

Rooftop gardens are also a great idea, along with open spaces at churches, schools, businesses and public property, with interested people taking care of the plants, trees and harvest.

With a shift in economic circumstance, local food security ideas will become increasingly popular.


How would energy look in a relocalized economy? First, we should make conservation -- a key leverage point -- an absolute priority. Reducing demand puts managing demand closer within reach. We can avoid much of the need in the first place.

The Eugene Water and Electric Board's existing energy conservation programs should be vigorously marketed. Call them to learn more.

Neighborhood-scale methane gas has tremendous potential: human waste from several blocks could be brought together to a neighborhood biodigester, producing methane for cooking fuel and great fertilizer as a byproduct.

Architecture and design standards can be a great energy saver. Making full use of the sunny south sides of houses and commercial buildings for passive solar is only common sense. Retrofit existing homes with passive solar. All new construction should meet elevated energy conservation and solar standards, with non-toxic materials.

Many suburban houses with south facing garages can convert their garages into solar spaces by replacing the garage door with glass and transforming the space inside for more productive use.

Local biofuels need to be a part of the mix for essential services, such as on the farm and for fire trucks and ambulances.

Redesigning the urban landscape along with upgraded public transportation should be a high priority so that the use of automobiles becomes much less of a need.

Neighboring cities and towns can best move in a direction where they develop their own more independent economies so they are no longer bedroom communities. Bus and train service between towns needs to be upgraded such as between Eugene and Coos Bay, south to Medford and north to Portland. Imagine: in the 1920s there were twelve trains per day between Eugene and Portland.

Reducing our energy footprint is one of the smartest choices we can make. Understanding why we must change the way we relate to energy will help build the cohesion and consensus for city and regional policies that support the goal of relocalizing our energy supply.

Land Use

Land use is an absolutely critical part of both urban and rural relocalization, and is a tremendous key leverage point. Land use is the stage and set for how we live -- intimately related to transportation, energy, public health, the economy and foreign policy. . . Relocalizing the urban landscape means making much better use of what is already here. A basic goal is for towns and cities to be more compact with the goods and services people need -- much closer to where they live.

Goals put forth by a culture of cohesion would include an urban space that is attractive, inspiring, a joyful place to live, work and play. Think of edible landscaping, public outdoor meeting areas, smart design to make best use of solar assets and natural drainage, green spaces, community centers, while protecting best soil for food production. Compact and thoughtful urban design reduces the need for cars and can nurture community cohesion.

Imagine an entire residential block making best use of available space, transforming space currently taken up by automobiles to play-space, gardens, child care areas, small businesses, solar designed bungalows and more. The city of Eugene approves of block planning and should do far more to promote it, in partnership with neighborhood organizations.

Think of attractive multistory, mixed use urban villages built on existing parking lots. Those locations are already commercial, often with existing businesses, bus routes and utilities. The multistory redevelopment can include new goods, services, employment and culture specifically for that location. Rooftops could become gardens to supply the village's natural food store. There would be edible landscaping and convenient transit to other centers and downtown.

Suburban renewal. Suburbia does offer useful assets. Think of turning grass to garden, include solar redesign, rain water catchment, extending the growing season with cold frames, removing concrete, and creating community and fun. There are already well over a dozen suburban renewal projects in Eugene where substantial needs of the residents are met by on site resources.

Land use has the potential to be a catalyst, and an enormous key leverage point, for culture change. The jobs created by urban redesign would be great for the local economy.

The community visioning, the cooperative planning, the work parties, the benefits to the environment, our local security, and public health would bring people together like nothing before has.

Culture change

A peaceful ecoculture's values could come from the Koran, the Bible, the Torah, from Buddhist, humanist or pagan musings and just common sense. Compassion, modesty, honesty, material simplicity, reverence for nature - all are considered virtues in practically every great philosophy. What a difference if those virtues, rather than expensive "cheap thrills," were the basis for a civilization.

Recall when you've had a powerful experience with your higher self. I hope that’s easy and that the experience was recent. What were the circumstances -- and let’s omit the pharmacology? You probably were not in a hurry. It could have been in a beautiful place, maybe in nature, maybe in a human-created space. Could be you were with people you enjoy and have strong bonds with. You may have been involved in some kind of healthy community project or a festival, or been totally on your own.

Can we take this positive sensation beyond our closest friends? How would it be if we lived in a neighborhood and community where such experiences were far more common because there was a far higher level of cohesion? A place where you knew you had solid, elevated ideals in common with many more people around you. In effect, your inner circle of friends would be greatly expanded.

This concept is an enormously important key leverage point: healthy, popular and shared values and goals. A culture of cohesion will make wise choices and move towards its goals, using its resources and assets in a highly productive way.

Wednesday, August 30, 2006


Joan Hoff

Over three decades ago on December 21, 1971, Richard Nixon approved the first major cover-up of his administration. He did so reluctantly at the behest of his closest political advisers, Attorney General John Mitchell, Domestic Counselor John Ehrlichman, and Chief of Staff H.R. Haldeman. The public remains ignorant of this seminal event in Nixon's first term and journalists and historians have largely ignored it. The question is why? A recently released Nixon tape transcribed from an enhanced CD produced by the Nixon Era Center provides the clearest answer to this thirty-year-old Nixon secret.

On that December day Nixon agreed to cover up a criminally insubordinate spying operation conducted by the Joint Chiefs of Staff inside the National Security Council because of the military's strong, visceral dislike of Nixon's foreign policy. In particular, the JCS thought Nixon had gone "soft on communism" by reaching out to the Chinese and Russians, and they resented Vietnamization as a way to end the war.

As early as 1976 Admiral Elmo Zumwalt publicly made these military suspicions and resentment abundantly clear in his book, On Watch: A Memoir. "I had first become concerned many months before the June 1972 burglary," Zumwalt wrote, "[about] the deliberate, systematic and, unfortunately, extremely successful efforts of the President, Henry Kissinger, and a few subordinate members of their inner circle to conceal, sometimes by simple silence, more often by articulate deceit, their real policies about the most critical matters of national security." In a word, Zumwalt, like many within the American military elite, thought that Nixon's foreign policies bordered on the traitorous because they "were inimical to the security of the United States."

This atmosphere of extreme distrust led Admiral Thomas Moorer, head of the JCS, to first authorize Rear Admiral Rembrandt C. Robinson and later Rear Admiral Robert O. Welander, both liaisons between the Joint Chiefs and the White House's National Security Council, to start spying on the NSC. For thirteen months, from late 1970 to late 1971, Navy Yeoman Charles E. Radford, an aide to both Robinson and Welander, systematically stole and copied NSC documents from burn bags containing carbon copies, briefcases, and desks of Henry Kissinger, Alexander Haig, and their staff. He then turned them over to his superiors.

The White House became suspicious when Jack Anderson published a column on December 14 entitled "U.S. Tilts to Pakistan." Such information logically could only have come from meetings of the Washington Special Action Group on December 3 and 4, which discussed the fact that Pakistan was being used as a conduit for the top secret negotiations the Nixon administration was carrying on with China - negotiations that would culminate in rapprochement with that Communist nation in the spring of the next year. Clearly someone had leaked the minutes of the WSAG meeting to Anderson, and the suspicion fell on the military.

The White House immediately ordered an investigation of this leak and Pentagon Chief Investigator W. Donald Stewart subsequently uncovered the JCS spy operation when Yeoman Radford "broke down and cried" during a polygraph test, indicating that he spied with the "implied approval of his supervisor" Admiral Welander. Stewart believed that it was a "hanging offence" for the military to spy on the president, and Ehrlichman's assistant, Egil ("Bud") Krogh, thought that it was the beginning of a military coup because of the interference it represented "into the deliberations of duly-elected and appointed civilians to carry out foreign policy." Radford's confession not only led to such dire evaluations, but also to the December 21 conversation among the president, Ehrlichman, Haldeman, and John Mitchell.

The most striking aspect of this tape is the passive role played by Nixon - the so-called original imperial president. First, he is out-talked by the others throughout this fifty-two-minute conversation. Toward the end of the tape, the president can be heard saying to his advisers in a loud voice that the JCS spy activity was "wrong! Understand? I'm just saying that's wrong. Do you agree?" A little later he called it a "federal offense of the highest order." Up to this point, however, John Mitchell told the president that "the important thing is to paper this thing over" because "this Welander thing . . . is going to get right into the middle of the Joint Chiefs of Staff."

In other words, Nixon would have to take on the entire military command if he exposed the spy ring. Moreover, this exposure would take place in an election year, when the president had scheduled trips to both China and the Soviet Union to confirm improved relations with these countries - trips which the military opposed. Taking on the military establishment with such important political and diplomatic events on the horizon could have proven disastrous for the president's most important objectives and revealed other back-channel diplomatic activities of the administration. Later, in his memoirs, the president said that the media would have completely distorted the incident and that exposure would have done "damage to the military at a time when it was already under heavy attack."

In contrast, at the time all three men agreed with Nixon about the seriousness of the crime committed by the JCS. Mitchell even compared it to "coming in [to the president's office] and robbing your desk." However, they advised him to do no more than to inform Moorer that the White House knew about the JCS spy ring, to interview Welander (who was later transferred to sea duty), and to transfer Radford. Moorer subsequently denied obtaining any information from purloined documents, fallaciously claiming that Nixon kept him fully informed about all his foreign policy initiatives. If this had been true, there would have been no need for Moorer to set up a spy ring. Welander, for his part, according to this tape, had initially refused to answer questions about the spying he was supervising on the questionable grounds that he had a "personal and confidential relationship" with both Kissinger and Haig.

Nixon became incensed when he heard this. "Just knock it out of the ballpark, stop that relationship," he told his aides on December 21. Subsequently, in his first interview, Welander admitted his role in the naval surveillance operation and implicated then-Brigadier General Alexander Haig, Kissinger's aide and liaison between the Pentagon and the White House, in this criminal operation. Haig ultimately prevailed upon his old friend and colleague Fred Buzhardt, general counsel to the Defense Department, to re-interview Admiral Welander and eliminate the compromising references to him. Still, the existence of this first Welander interview continued to haunt Haig because he knew that if the president found out, there would be no more military promotions for him, let alone a future in politics, and so he was determined to see that his role in this affair remained under wraps.

Haig has succeeded in covering up his involvement down to the present day. For example, he told an interviewer in 1996 that the whole JCS spy ring was nothing more than the normal kind of internal espionage that goes on all the time among executive branch departments. Moreover, after he became Nixon's chief of staff, he went to great lengths to ensure that the various congressional investigations never concentrated on the Moorer-Radford affair, thus preventing exposure of his involvement in spying on the NSC while he was Kissinger's aide. When caught in the tug-of-war between the Joint Chiefs of Staff and the White House, Haig's loyalties to the very end remained with the military.

This December 21 tape also indicates that Nixon did not trust either Kissinger or Haig. At one point he stated that "Henry is not a good security risk" and that he was convinced that "Haig must have known about this operation . . . it seems unlikely he wouldn't have known." Yet after Watergate forced the resignations of Haldeman and Ehrlichman, Nixon appointed Haig his chief of staff! Had the president chosen to ignore the advice of his closest aides in December 1971 and follow his own instincts about exposing the JCS, Haig's culpability would have become evident and his career under Nixon would have ended, quite possibly preventing him from serving in both the Ford and Reagan administrations.

By covering up the JCS spy ring (but letting the military know they knew about it), Nixon and his aides apparently deluded themselves into thinking they would have greater leverage with a hostile defense establishment. However, the JCS also knew that Nixon and Kissinger had been bypassing both the secretary of state (William Rogers) and the secretary of defense (Melvin Laird) in making their foreign policy decisions, and could have retaliated with the charge that civilian leaders had been deliberately ignored in the administration's back-channel processes.

This successful cover-up of the Moorer-Radford affair set the stage for further, lesser cover-ups, ultimately culminating in the mother of them all - Watergate. As a result, it should be considered the first and most important of the Nixon cover-ups. Had it not taken place, perhaps Nixon would have survived his second term in office.

Joan Hoff is Distinguished Research Professor of History at Montana State University and author of Nixon Reconsidered (Basic Books).