Sunday, January 31, 2010

The White Ribbon


In a recent post, I lamented the frenetic violence and universal despair that fill so many recent movies. And in my first year of blogging, I commemorated the death of Ingmar Bergman, who perhaps first gave many Americans cause to be conscious of cinema as an art form.

In the recently released movie The White Ribbon ("Das Weisse Band"), director Michael Haneke demonstrates that both violence (mostly off-camera) and despair can be legitimate subjects, topics that leave the audience thinking and talking, rather than merely entertained and/or depressed. His cinematography recalls that of Bergman -- black and white chiaroscuro, stationary camera, long periods of virtual silence that help to develop mood and character. The black and white photography is the most beautiful use of that medium that I've seen in any recent film, at times even reminding me of the classic photography of wilderness scenery by Ansel Adams.

The movie takes place in a small, isolated German hamlet, immediately before World War I. The social structure of the village is dominated by the baron and his family, who employ most of the villagers; the Lutheran pastor; and, to a lesser extent, the village physician. The story is told from the perspective of a young, newly-arrived schoolteacher, who, decades later, as an old man, relates his recollections.

Strange, violent events occur throughout the film. The responsible parties are never identified, although at least two possible scenarios emerge. The children of the village seem, at least at first blush, to be perfectly normal kids for their time and place. But throughout the film, they are seen largely as a unit -- a swarm of children, childlike but somehow threatening. Like a Greek chorus, they observe the life of the village and the disasters that ensue. But they are a silent chorus, one that observes but does not comment. This image of a dramatic chorus is echoed in the final scene of the film as the boys of the village, gathered in the church balcony, hover over the still stoic but now devastated congregation, singing the ironically triumphal hymn, A Mighty Fortress is Our God.

Evil obviously haunts the village, but no one person appears evil -- individual villagers appear merely weak, scared, vain, immature, or powerless. Village society is tied in knots by repression -- sexual repression, class distinctions, social conventions. (The schoolteacher good-naturedly keeps reminding the girl to whom he's betrothed to stop calling him "sir" -- in German, I suspect he's asking her not to use the formal form of address.) The adults are so inhibited in all these respects that they find it impossible to share with each other their real feelings: impossible, at least, until a dam occasionally breaks and they blurt out their long-repressed emotions in ways that hurt each other devastatingly.

The parents love their children, but can show their love only through draconian punishments, in the hopes of rearing them "correctly." (The pastor, after beating his "misbehaving" children, forces the two oldest to wear white ribbons on their arms, to remind themselves of the purity to which they must aspire.) The baron and his family stay wealthy by exploiting the villagers, lower classes whom they see as only a step or two above the farm animals, but at the same time they attempt to treat them fairly. The villagers both respect and hate the baron. The pastor repeatedly reminds his congregation of their sinfulness and their duty to live righteously. He treats his own children -- whom he obviously loves -- no differently from the rest of his congregation. The village doctor is both kind to his patients and filled with a contempt for his long-time mistress that he finally reveals in a horrific display of verbal cruelty.

This is the picture of a dysfunctional world, as dysfunctional as the worlds shown in today's pop science fiction movies. Out of such dysfunction, evil would seem to flow naturally. As it does, although we never learn precisely how or through whom. We are reminded -- not explicitly by the movie itself, but by some reviewers -- that the children who stand around and watch impassively as evil unfolds -- or perhaps are even themselves the agents of evil -- were to grow into the German generation that gave the world Adolf Hitler.

But we also see ourselves in the villagers. As would any other nation, perhaps; and any other generation. The film reminds us of something uncomfortable about our very humanity itself, not simply about the German people, and not simply about quasi-feudal society before World War I.

-----------------
Winner, Palme d'Or, "Best Picture," 2009 Cannes Film Festival

Friday, January 29, 2010

Lack of restraint


One of the "highlights" of the State of the Union address -- from the perspective of journalists -- was President Obama's sharp criticism of the Supreme Court's recent 5-4 decision in Citizens United v. Federal Election Commission, 558 U.S. ___ (Dkt. No. 08-205, Jan. 21, 2010), and, even more delectable, Justice Alito's facial reaction to the criticism.

Justice Kennedy's opinion on behalf of the Court in Citizens United is 64 pages long (in slip opinion format), accompanied by separate concurring opinions from Chief Justice Roberts and Justices Scalia and Thomas -- each with a different take on what was being decided -- and by a 90-page dissenting opinion from Justice Stevens on behalf of the four so-called liberal justices. The combined length of this multitude of opinions is 183 pages. The whole mess will be analyzed at great length in law review notes and articles over the coming months and years. I won't presume to attempt an analysis in a multi-paragraph blog posting.

I just want to comment on a basic point brought up in Justice Stevens's dissent, and on its implications. Justice Stevens points out, at some length, that the Court generally follows certain rules in accepting and deciding appeals -- it avoids overruling its own prior decisions unless circumstances have changed sufficiently to make such an overruling necessary, and it decides issues as narrowly as possible (e.g., it doesn't declare an entire statute unconstitutional if the statute can be saved by invalidating just one clause). Neither rule is unbreakable, but the Court generally gives a good reason for breaking them. (For example, in 1954, in Brown v. Board of Education, overruling an 1896 opinion upholding equal but segregated schools, the court held that events since that date had made it clear that separate schools were inherently unequal.)

In Citizens United, the Court stretched to resolve a matter that it did not have to resolve, and it overruled a century of developing constitutional law, including a decision it had handed down as recently as 1990, and a portion of one decided in 2003; it also, in effect, overruled all subsequent decisions relying on the 1990 ruling. As Justice Stevens points out:

The only relevant thing that has changed since [its earlier contrary decisions in] Austin and McConnell is the composition of this Court. Today’s ruling thus strikes at the vitals of stare decisis, "the means by which we ensure that the law will not merely change erratically, but will develop in a principled and intelligible fashion" that "permits society to presume that bedrock principles are founded in the law rather than in the proclivities of individuals."

What's interesting is that the rallying cry of both political conservatives and judicial conservatives has long been "judicial restraint." By judicial restraint, lawyers mean judicial non-activism -- deference to the decisions of Congress and reluctance to hold legislation unconstitutional; adherence to existing law and past judicial decisions when interpreting and applying that law; insistence on resolving any specific appeal on as narrow a ground as possible, giving maximum deference to the fact-finding functions of the trial judge.

That the Roberts court ignored these principles in Citizens United suggests that the Roberts court is not a "conservative" court but a "radical" court. It is unwilling to let the law develop in a slow and orderly manner. It is unwilling to give deference to its own past precedents. It is unwilling to give deference to Congress.

It thus shares certain characteristics with the "liberal" courts of the past, such as the Warren court that abolished segregation in schools and, subsequently, in other aspects of public life. Its "conservatism" is not a judicial conservatism, but a radical political conservatism. The Roberts court is willing to bend the rules, to be activist, to overrule Congress, and to rebuke trial courts -- not in pursuit of individual rights and liberties, however, but in order to promote the business and political interests of certain elite segments of the population, segments of the population that seemed to be doing just fine without the aid of the Supreme Court.

The Court has returned, moreover, to an approach made infamous in the early 20th century -- that because the law chooses to treat corporations as "persons" for certain purposes, in order to protect their investors from unlimited financial liability, corporations must therefore be treated as persons for all purposes. Corporations, therefore, are to be guaranteed first amendment rights identical to those protecting you and me and Mr. Jones across the street. The Court abandoned this approach at least by 1936, during the New Deal. Who'd a thunk it would return to haunt us in 2010?

For any voter tempted to throw up his or her hands at the frustrations produced by our political system, to shout "a pox on both your parties" and refuse to vote -- I suggest that he or she consider how much difference the appointment of just one new justice can make to the future direction of our legal institutions -- and of our Nation.

The look on Justice Alito's face while listening to the State of the Union was the infuriated expression of a politician being challenged. It was not the dispassionate restraint of a scholar and judge.

Wednesday, January 27, 2010

Trekking memories


I finished off my little exercise against the combined forces of local high school students this evening. Too exhausted to write, but nevertheless tingling with the urge to blog, I post a few photos from last October's Annapurna trek -- mostly not duplicating the ones posted on Facebook -- for my audience's amusement.

Monday, January 25, 2010

Cat people


That guy next door with the big golden retriever is friendly and gregarious. The woman across the street with the three Siamese cats is obsessive, anxious and secretive. Those are the stereotypes. Any basis in fact?

The answer is "yes," according to a University of Texas professor of psychology. Sam Gosling has published a study showing personality differences between "dog people" and "cat people."

According to his findings:

  • Forty-six percent of respondents described themselves as dog people, while 12 percent said they were cat people. Almost 28 percent said they were both and 15 percent said they were neither.
  • Dog people were generally about 15 percent more extraverted, 13 percent more agreeable and 11 percent more conscientious than cat people.
  • Cat people were generally about 12 percent more neurotic and 11 percent more open than dog people.1

According to the study, people who say they like both dogs and cats are essentially dog people, for the most part, with some neurotic traits.

This study tells me all I need to know about both popular psychology and the University of Texas. (I assert, neurotically.)

As I've mentioned, I prefer cats, and I live with cats. I also like dogs. I suspect that I actually relate to dogs with greater empathy than do their own masters, who themselves are too busy running around being extroverted and agreeable and wagging their own tails to have developed any real understanding of the emotional needs of their adoring pets.

I maintain -- and do stop me if I've told you this before -- that dog people like dogs primarily because they can dominate them. Even if a guy spends all day at work staring at a computer screen and saying "yes sir" meekly to his boss, he can come home and be the alpha male in his dog's small pack. He can yell at the dog; he can put the dog on a leash and drag him around. His dog is also appealing, I suspect, because of a faux parent-child dynamic. The dog, he fantasizes, is that unusual child who is loving and unquestioning and obedient, and -- above all -- respectful.

Cats are esthetically pleasing, of course. They're graceful. They're clean. More important, their personalities are everything that a dog's is not. Your cat relates to you as an equal. He will, of course, gladly accept your food and your bed, but only rarely your rapturous hugs. He offers you his friendship only when a certain level of mutual trust has been attained. When you come home from work, don't expect your cat to rush up and grovel with joy at your arrival, any more than he would expect you to pounce on him with abandon as he enters the cat door.

A cat is an adult, and he honors you by presuming the same of you.

What a cat will do, with half-open eyes, is study you all evening while you're reading or computing, and then, when the proper moment arrives, steal up next to you and brush his tail against your leg. He will sit on your lap, when he so chooses, and reach around the book you're reading to brush his paw against your face. He will observe the signals, and silently move upstairs ahead of you as he sees you prepare for bed. A cat's affection is more moving than a dog's, because it's based not on some genetically programmed urge to submit to the rule of the leader, but on a gradually developed sense of affection and trust, arising out of his experience with you and yours with him.

Dogs are fun. They're great company, and obviously are far better designed than cats for going out for a romp in the countryside. But friendship with a cat is an accomplishment, an achievement that pays emotional rewards for years, for the rest of the cat's life -- a life that is fortunately longer than that of a dog.

And if all this proves me neurotic, well, I can live with that.

------------------
1University of Texas at Austin News (Jan. 13, 2010). The article has drawn a large number of on-line comments, almost all negative, including a claim that the published paper is an embarrassment to the University of Texas. The comments are far more entertaining (and intelligent) than the article itself. (Almost half the comments concern the spelling of "extraversion." The word is properly spelled with either an "a" or an "o.")

Photo above: Loki, age 5

Friday, January 22, 2010

Trial by student body


If I hadn't become an attorney, as I've intimated in earlier posts, I might have drifted into teaching. I've also acknowledged that the realities of trying to teach bored, apathetic youngsters probably would have quickly exhausted my limited reserves of patience and empathy for people radically different from myself, as well as my tolerance for prolonged frustration.

In other words, I doubt I could have handled the kinds of students who desperately need skilled teaching -- as opposed to the kinds of students who are apt to find a way to learn on their own anyway, and for whom formal teaching is almost superfluous. If I were to teach successfully, the kids I taught would have to be the right kind of kids.

The high school students I worked with Wednesday, helping them prepare for their mock trial competition in March, were the right kind of kids.

When I arrived at the high school at 5 p.m., a large group of students was already chattering in front of the locked library door, waiting for the adviser to arrive and let them in. By 5:10, virtually every student signed up for the program -- probably 30 at least -- had arrived. The adviser, a young attorney from a downtown firm, has spent every Wednesday evening (and some weekends) of his own time since October working with these kids. He obviously enjoys working with them, and they were relaxed and friendly with him. Watching this attorney and the kids interact, I couldn't think of any way the legal profession could possibly present a more likable face to the community.

The kids are divided into two prosecution teams and two defense teams. Wednesday's "scrimmage" cast me as an assistant U.S. Attorney, trying the case against one of the two student defense teams, with the adviser as judge, ruling on motions and objections. We presented opening statements to the jury, and then direct and cross examination of the government's four witnesses. Next Wednesday, the defense will present its own four witnesses (including the defendant) for direct and cross, and we will wind up the trial with short closing arguments.

Those students from the three teams not directly participating in Wednesday's scrimmage occupied the jury box, ran several video cameras, or simply watched and took notes for their own use later.

This same "scrimmage" process will be repeated, in the weeks to come, with other practicing attorneys working with the other teams.

The kids were well prepared, enthusiastic, articulate, intense and serious while playing their parts. They were funny, exuberant, and in typically high teen-aged spirits before and after the "performance." They are the kinds of kids you'd expect to find involved in orchestra or debate or the school newspaper, if not signed up for a mock trial competition. They are the kinds of kids most parents hope for.

I was delighted, and I was impressed.

---------------------
Photo is a stock photo of mock trial participants, not one of the Seattle participants

Wednesday, January 13, 2010

Fleetly-presented Homer


One of the many advantages of living near a major university is the ability to attend events aimed at the non-student population. Last night, a friend and I attended the first lecture in a three-part series sponsored by the University of Washington Alumni Association, entitled The Treasures of Greece. The speaker was Carol G. Thomas, a UW history professor specializing in classical studies.

Sometimes these series are outstanding, sometimes merely interesting. After hearing last night's lecture, I suspect that Professor Thomas found herself talked into covering an extremely broad topic in a very limited period of time. Last night's two-hour lecture was to cover Greek history up to the classical period; next week's will "do" the Hellenistic era; and the third and final lecture will cover Greece in modern times. In light of the fact that Professor Thomas began her lecture last night twenty minutes late, that there was a 15-minute intermission, that there were persistent problems with her microphone and the visual projection equipment, and that 15 minutes were reserved at the end for questions from the audience -- well, there wasn't much time remaining in a two-hour period to discuss Greece in all its splendor.

Professor Thomas asked and addressed the question of whether Homer's world really existed, an issue that I thought had been fairly clearly answered, at least in part, since Schliemann's excavations at Troy in the late nineteenth century. Nevertheless, she did show some interesting photos of the excavations. She also discussed some recent discoveries in ancient Hittite writings -- the Hittites had achieved a form of writing while the Greeks were still pre-literate -- that appear to confirm the existence and economic importance of a number of locations in Asia Minor that are described in the Iliad.

She also played a recording of an excerpt from the Iliad recited in ancient Greek, demonstrating the rather hypnotizing effect of poetry composed in dactylic hexameter, and she reminded us how the catch phrases that are repeated over and over in Homer -- "rosy-fingered dawn," "wine-dark sea," "fleet-footed Achilles" -- together with the regular meter helped ancient bards to memorize lengthy epics like the Iliad before writing was available.

The UW, until recently, presented these history series almost every academic quarter, usually eight to ten lectures per series. It's unfortunate that they are now being offered only once a year, and that the series this year has been truncated to three lectures.

Greek history before the classical era is, taken alone, a fascinating subject, and a subject of general appeal -- an appeal revealed by the large attendance last night, completely filling one of the largest auditoriums on campus. It's unfortunate that the university is unable to provide a lengthier and more in-depth series on the topic, aimed at general alumni audiences.

Monday, January 11, 2010

High school lawyers


So, do we really need lawyers? That's the question mankind has asked for generations. Could even high school students -- given a year's part-time training -- be qualified to try a lawsuit?

I'm mulling over a fascinating pile of paper that's been prepared for use in an upcoming high school mock trial. Before me, spread out over my dining room table, is a summary of facts, lengthy statements by eight witnesses, some jury instructions, several exhibits, and a federal grand jury indictment of a one-time radical environmental activist.

Because the matter is pending before the court, as we lawyers like to say, I won't go into any detail. I'll just say that the indictment is for an alleged arson committed against a fictional Washington research university, and that it was allegedly motivated by a professor's genetic engineering studies. In general, although not in specifics, the case resembles an actual arson that occurred a few years back in Seattle.

My involvement is fairly low key. I am putting together my version of the federal government's case against the defendant, and will spend two evenings, 2 ½ hours each evening, acting as the U.S. Attorney, opposed by students acting as attorneys for the accused environmentalist. This will be only a "mock" mock trial, one of several the students will put themselves through, intended to help them get ready for the actual interscholastic competitions that will occur in the spring.

I am completely impressed. This particular high school has a middling reputation in the city: not a bad school, but not one of the two or three outstanding schools, either. I asked the moderator this morning how harsh I should be in objecting to improper questions and inadmissible evidence. He told me to forget entirely that these were high school students. He said that the kids have been studying the federal rules of evidence since last spring, and that I should treat them with no more mercy than I would opposing counsel in a real trial. The students want to know every kind of objection and argument that might be thrown at them when they get into the actual competition.

These young people have been giving up an evening of their time, once a week, since last spring. They will continue to do so until the competition. They are doing it not for a class grade, but for fun, glory, excitement, experience -- and perhaps for the résumés accompanying their college applications.

Whenever I hear comments to the effect that "kids nowadays are no damn good," something like this comes along and knocks me off my feet. I'm working like crazy to be a worthy sparring partner when I confront them for my first meeting with them next week. I'm eagerly looking forward to seeing how they perform. I'm happy for them, and with the schools and YMCA organization that have made this learning experience possible.

I'll give you my impressions of the experience later. Meanwhile: "May it please the court ... "

Saturday, January 9, 2010

War against language


...Mr. Obama, the man who would reform health care, is also a war president, and one who has not yet proved to Americans that he can be a success at that.
--The Economist (Jan. 9-15, 2010)

Regardless of what Americans may believe about our president's ability to combat terrorism, Obama is not a "war president."

The United States is not, of course, in a formal state of war, but this is not decisive. Congress has not formally declared war since World War II, but no one would doubt that we were at war in Korea and that we were at war in Vietnam.

Our involvement in Iraq and Afghanistan is more questionable. With whom, exactly, are we at war? It would be far more accurate to state that we have intervened in those two countries to maintain order, and/or to chase after a group blamed for 9-11 and/or to take sides in civil insurrections.

But the "war on terrorism" ("terrorism" itself being a loaded term, meaning whatever the politician using it wants it to mean) is no more a "war" in any meaningful sense than is the "war on drugs" or the "war on illiteracy" or the "war on littering." I don't mean to be flippant. Obviously the stakes are high and the risks are deadly. But war, as understood in American and international law, refers to a specific hostile relationship between two sovereign nations. The "war against terrorism," which is not even aimed at any specific geographical area, resembles more closely a form of criminal enforcement, such as the battle against the Mafia or against the drug syndicates.

The reason why the definition is critical, the reason why we can not permit such a "war" -- a use of force against individual persons and their conspiracies that resembles a war only metaphorically -- to be considered war in any legal or constitutional sense, is that sloppy definitions permit sloppy reasoning and sloppy justifications. The Bush regime was able to push through a large number of questionable statutes, regulations, and policies under the mantra "We're at war!" How often did we hear Cheney say, in so many words, "I don't think you understand, sir. This country's fighting a war!"

Since the Bush administration contended that the "war against terrorism" would last years, decades, perhaps forever, it became clear that supposedly temporary infringements on civil rights -- such as the so-called Patriot Act -- were intended to become permanent. (Fortunately, in a narrow 5-4 decision, the Supreme Court in 2008 rejected attempts by Congress and the administration to strip the federal courts of jurisdiction to issue habeas corpus on behalf of terrorist suspects being held at Guantánamo.)

In his novel 1984, Orwell wrote of a despotism that controlled its citizens, in part, by eradicating from the English language all words that would enable them to even think in seditious terms. Language is important. Words are important. The correct use of words is important.

The Nazis, in a nice Orwellian touch, hung that famous sign over Auschwitz: "ARBEIT MACHT FREI" -- Work makes you free. It didn't, at least in any way that the inmates might have hoped. We ourselves may be engaged in a critical struggle against terrorism. But we are not "at war" against persons and their organizations that commit so-called terrorist acts. And using the word "war" doesn't make it so.

Tuesday, January 5, 2010

New year gloom


January, the month of beginnings. The hopeful New Year baby, cheerfully pushing aside the tired Old Year graybeard. A time to be enthusiastic, optimistic.

And yet, I'm depressed. Not for my own life, which is purring along quite smoothly, thank you. But for the civilization in which I live.

Is it just me? Don't you have the feeling that nothing is working out right? That we're not just going through a downbeat phase, but that we are well advanced into a long-term era of secular decline? That we -- you and me, the folks apt to be reading this post -- resemble those reasonably well-off Romans in the Late Empire, nice folks who went about worrying about getting promotions and helping their kids get into good schools, not noticing what was happening around them? That the economic foundations of their civilization had already collapsed and that they were living off the assets of the past, that their culture was increasingly debased, that the barbarians were storming the gates -- maybe not yet gates close at hand, but Roman gates they had never stormed before?

I don't mean by all this just that I'm perturbed that the Pac-10 and Big Ten did so poorly against the SEC and other boorish, no-account conferences, although that hasn't improved my mood any. No, my thoughts turn to even weightier matters.

For example, in yesterday's New York Times, columnist David Brooks argues that the average American has lost confidence in our political institutions, in our scientists, in our foreign policy, and in our business leaders. Most of all, the average American has totally lost confidence in what Brooks calls "our educated classes." "Every single idea associated with the educated class has grown more unpopular over the past year," he claims.

I've spent too much time watching football this past week, which has given me an exposure to the world of television that I usually lack. The commercials! Ad after ad glorifies an image of the American citizen, and especially the adult American male, as an idiotic, loud-mouthed adolescent -- an ersatz teenager, with all of a teenager's understandable immaturity and gross behaviors, but with none of a true teenager's hopes of growing up and becoming educated into a more intelligent and sensitive human being.

I especially marvel at the televised trailers for upcoming movies, movie after movie displaying an obsession with bleak post-apocalyptic dystopias, cops gone wrong, autos consumed by fireballs, and heroes (or are they villains? who knows?) shooting, slugging, slicing, blowing up, vaporizing and otherwise eliminating everyone who gets in their way, as though human beings were disposable adversaries in a computer game.

Look at the films by major studios opening in January alone: Daybreakers (plague has changed most citizens into vampires); Book of Eli (father and son try to survive in a bleak post-apocalyptic world); Legion (God's fed up and has sent angels to terminate his experiment); Dread (college student taking part in case study uses it to play on the fears of his peers); Edge of Darkness (cop on rampage -- supposedly a good cop, but I didn't like the looks of the trailer); Bitch Slap (no comment). Some of these may be artistically sound movies, and some may be movies I'll actually decide to watch, but that's not my point. The point is that similarly themed movies come at us month after month -- and that movies with these violent and grossly pessimistic themes tend to be the films that best succeed commercially. That fact, I submit, shows something disturbing about our popular culture -- and our popular culture is the measure of our civilization.

I'm worried and I'm depressed. But I'll try to look on the bright side: we still don't entertain ourselves by watching humans being fed to the lions, although Texas's obsession with the death penalty does give me pause. Someone will write me that I should just not watch movies or TV if it's going to disturb me all that much. You're right, I'm free to choose my own amusements, just as the more finicky citizens of Rome were free to write poetry and play the lyre and ignore what was going on downtown in the Coliseum -- but shutting their eyes to what was happening to their own civilization didn't save the Empire, did it?

And now, if you'll excuse me, I have to check my local newspaper and see which theaters are showing Avatar this weekend.

Friday, January 1, 2010

What's in a name?


Humbug or not, Happy New Year, everybody! In fact, Happy New Decade.

Not strictly, of course. Just as the twentieth century didn't end until the last day of 2000, so the first decade of the twenty-first century won't end until the last day of 2010. But only clueless geeks care about nitpicking stuff like that. (Which is why I myself point out the technically correct date to everyone at every opportunity.)

Whatever the technicalities, we all "know" that we are now beginning a new decade, just as we all "knew" on January 1, 2000, that we were beginning a new century. But what shall we call these first two decades? From the last century, we know all about the roaring 20's, the depression 30's, the apathetic 50's, the hippy 60's, etc. But what simple word can be used to describe decades beginning with "0" and "1"?

Actually -- I have on good authority -- a century ago most people called the 1900-09 decade the "aughts." "Aught" being a word synonymous with zero, and familiar from old stories where some elderly geezer leans back in his chair and reminisces: "That was a bad one, it was. Yes sir. The Great Blizzard of 19-aught-6." Parenthetically, "naught" means exactly the same thing as "aught," although one might suppose it a contraction of "not aught." And yet you never hear of an old fool ranting on and on about the great blizzard of 19-"naught"-6. This, my friends, is attributable to "fashion," and no rational explanation should be sought.

Obviously, then, it thus makes perfect sense -- and is in full accord with precedent -- to call the decade just completed the "aughts." The "Awesome Aughts"? Or "Awful Aughts"? Take your pick, depending on your own temperament, but I'm not aware of any such adjective attached by our twentieth century forebears to "aughts." They may just have been on to something. And if "aughts" sounds too archaic or hard to spell, let's just call them "the Bush-Cheney years" (or maybe "Cheney-Bush years"), and leave it at that. Which is to say "awful," and which is what I suspect we'll end up calling them anyway.

But I digress. I really meant to discuss the upcoming decade, the one we've just started (although you and I can secretly agree that it doesn't really start until 2011). Here I find no precedents. I've never read or heard of anyone calling the years 1910-19 the "teens"1 or the "tens." And I suspect I know why.

World War I began about half-way through the decade. The war ended one era and introduced a new one. It sharply divided the decade into two segments: the pre-war years, which in America were really part of Teddy Roosevelt's Progressive Era that began at the outset of the "aughts," and the war years, which ended Progressivism in this country and pushed our society down a slippery slope into the infamous "Roaring 20's." In Britain, the pre-war years were just a continuation of the Edwardian era (on that side of the pond, they do tend to identify eras more with reigns than with decades); and the war years -- which began in 1914 in Britain, rather than 1917 as here -- led directly into that unpleasantly messy transitional period that Britain experienced in the 20's and 30's. So to both the British and the American publics, the "teens" didn't really exist as an independent decade, but as the tail end of one decade and the introduction to another.

Barring an unforeseen disaster, the coming decade will not be so bifurcated. So what should we call it? And what will future generations call it? I suppose, most logically, simply the "teens," but our society has ways of spontaneously coming up with terms that no one could have predicted, so who knows? And that's my answer to the question posed by this essay: Who knows?2

I will stick my neck out with an unrelated prediction, however. I'll bet that by 2013, we will stop using the awkward term "two thousand thirteen," and will simply say "twenty thirteen." (Remember, you read it here first.)
--------------------------------

1(1-2-10) Although, embarrassingly, I read such a reference to the upcoming decade in this morning's New York Times financial section!

2(1-2-10) Even more embarrassingly, this afternoon's mail brings the 1-4-10 issue of The New Yorker, whose lead comment under "Talk of the Town" is entitled "What Do You Call It." The comment discusses the issue of the "aughts," but of course does so far more eruditely than could I. The writer (Rebecca Mead) observes that "aught" is a nineteenth century corruption of "naught," summarizes the highs and the lows of the past decade, and concludes that to call the past decade the "aughts" pleases no one, and that the decade remains "an orphaned era that no one quite wants to own, or own up to."

So I'm not The New Yorker. So screw it. Sure, you're far better off reading their writers. But then, I don't charge you $5.99 per copy either.