Sunday, January 30, 2011

A shining city


"Exceptionalism."

In a syndicated column today, Kathleen Parker notes that Obama's State of the Union address omitted "the word" -- the word that, as Parker sees it, will be a rallying call for the Republicans in 2012.

"The word conservatives long to hear."

Parker feels that Obama does believe that America is "exceptional" -- parts of his speech make it clear that he does -- but she wonders why he shies away from pronouncing the actual word. She says that Americans want their president to share their values, and 36 percent fear that Obama does not believe that America is "exceptional." If Obama doesn't like the word as it's now being used, she contends, he had better "take possession of the word," make clear to voters what the word does mean to him in a way that resonates with those voters -- and then make use of it often.

Her advice may be good politics. But I think I understand what President Obama meant when he once said, speaking overseas:

I believe in American exceptionalism, just as I suspect the Brits believe in British exceptionalism and the Greeks believe in Greek exceptionalism.

Great answer, I felt, and still feel. Horrible answer, politically, says Parker. Too "Harvard." Too (I gather) tone-deaf, too lacking in patriotic resonance.

Like most of us, I grew up being educated out of textbooks that were virtual Bibles of exceptionalism. All of world history was but prelude to the settlement of North America, the American Revolution, Manifest Destiny, our triumph in world wars, our bestowing the benefits of capitalism and democracy upon a benighted world, and, ultimately and incidentally, America's deserved and benign world domination. It was much as generations of British kids were educated by reading Macaulay, absorbing his optimistic Whig exceptionalism -- the sense that history was the story of the British people's inevitable progress from medieval darkness to their attainment in the nineteenth century (when Britain was deemed close to perfection) of personal freedoms, scientific enlightenment, parliamentary democracy and constitutional monarchy.

Like Obama, I have no doubt that we have developed certain habits and abilities in this country that are advantageous to us, some of which may be worthy of emulation by others. These traits, together with our enormous historical advantages of isolation from the rest of the world during our formative years and abundance of natural resources and empty land, helped us reach our economic position today. Inheritance of British political and legal traditions and good fortune in the intelligence and wisdom of our nation's founding fathers gave us our political advantages.

These are economic and political advantages that our ancestors fortuitously received, and that we today have fortuitously inherited. They are not grounds for self-congratulation, and they are not signs of our own wisdom, merit or favor with God, as extravagant claims of "exceptionalism" so often suggest.

We have been blessed by God with good fortune (as have other nations), but it's more than presumptuous to believe that God has chosen the United States of America as a unique vessel to reveal his plans to the world. We are not a New Israel. We are not a holy nation, a chosen people set apart. And if we are, or ever become, a "shining city on a hill," it will be because we have made some hard decisions and accepted some serious sacrifices in order to be a nation worthy of that description. It will not be because that was our Divine Destiny.

And even if attained, being a "shining city on a hill" will never be a permanent status. Nations come and nations go. Empires rise and empires fall. (Ask the British about it.) We are not exempt from the natural flows of history.

In preparation for my trip to Iran, I've been reading books that attempt to explain the mindset of the Iranian people. These books remind us that -- although people everywhere have much in common -- every people does have its own characteristics, its own values, its own hopes and desires. Not every people shares our aspirations, at least not to the degree that we imagine it does or should.

We don't always know what's best for ourselves, let alone for others. "Exceptionalism," as it's being bandied about today, assumes that we do. That assumption, in itself, is enough reason to distrust and avoid the word.

The United States has had many advantages, both in its material prosperity and in its history and ideals. We can be satisfied with that, as can many other countries with respect to their own advantages and history. We don't need to insist that we are unique and that our uniqueness somehow lifts us above others or gives us a basis to tell others how to live. As Kathleen Parker notes, in her penultimate paragraph:

We mustn't brag, after all. Great nations don't have to remind others of their greatness. They merely have to be great.

Wise words. I would add only that great nations also don't have to remind themselves repeatedly of their own greatness.

Wednesday, January 26, 2011

Auguries of global warming


More snow in the Northeast corridor. A half foot last night, even as far south as Kentucky. Eight to 12 inches forecast for tonight in New York City. Massive flight cancellations yet again.

I strolled across campus here in Seattle today. Green shoots are popping up through the ground. Buds have appeared on all the flowering trees, some about to burst out at any moment into pink and white blossoms. A high of 57 degrees was expected today; tomorrow's high is expected to reach 60 degrees.

Seattleites, New Yorkers:

Some are born to sweet delight.
Some are born to endless night.

Last week's New Yorker devoted its "Shouts & Murmurs" humor column to a roast of Mayor Bloomberg, deriding the New York mayor's aristocratic bemusement and detachment in the face of citizen outrage over his inept handling of snow accumulations in the city streets.

December 28th [his "diary" indicates]. The criticism mounts. Someplace called the Bronks (sp?) remains snowbound. Am I missing something? Yes, there is a lot of snow. Yes, we haven't plowed it. Yes, the subways and buses aren't really running so well. So why don't people simply use their helicopters? ... What is a snowstorm but an airborne snowing event? The American mystery deepens. Like a snowdrift. It's late and we are out of wine and my pajamas are itchy ...

In a snowless Seattle -- where a Seattle Times headline reads "Winter on Seattle's waterfront offers uncrowded fun (plus fish and chips)" -- our own ditzy mayor, free of weather concerns, can ponder idly the advisability of ending drunken street rowdiness at 2 a.m., when the bars close, by the simple expedient of eliminating the mandatory 2 a.m. closing hour. Presumably we'll find drunks more tolerable at 3 a.m. or 4 a.m.

In Washington, President Obama struggles by motorcade from the airport to the White House. The weather had grounded Marine One, the presidential helicopter. Here in the Northwest, on the other hand, spring began yesterday for fishing enthusiasts, as the first spring chinook of the season was caught in the Columbia. Nearly 200,000 of the salmon are expected to pass upstream to spawn, negotiating the hazards of rapids, fish ladders, and armies of avid fishermen.

Repeated stories about icy weather on the East Coast draw the same repetitious on-line comments: "You call this global warming?" "Wonder what Al Gore has to say about this?" The commenters ignore the second straight year of an unseasonably mild winter in the Northwest, apparently uninterested in what this recent warmth might mean.

Meanwhile, the polar ice keeps melting, the polar bears grow more frantic, the mountain glaciers keep shrinking, and storms worldwide become ever greater and more vicious. Does no one understand that there's no incongruity between "global warming" and "local cold weather"?

Ah, I think to myself, to be a teenager once again. To wear shorts and t-shirts in January, oblivious to whatever future climatological horrors this odd weather may portend. But what the heck? Why should I care? Let's be glad we're not in New York. Walk in the warm air. Enjoy seeing flowers bloom in mid-winter. Why should only Californians live like Californians?

Let Mayor Bloomberg and the polar bears worry about where it's all leading.

--------------------------
Art work by Isaac Littlejohn Eddy, reproduced in NY Times (1-26-11)

Sunday, January 23, 2011

Bringing thread counts to Ladakh


My heart sinks as I open the New York Times travel section and spot the headline: "Bringing Luxury to a Rugged Himalayan Haven." Which area is doomed now, I wonder. Oh, fine, I should have known -- Ladakh!

Ladakh is a high-altitude Buddhist region, administratively incorporated within Indian Kashmir, an area so remote that the article describes it as:

a region formerly the exclusive province of trekkers and religious pilgrims willing to trade comfort for the hope of transcendence.

But no longer! The article describes its writer's travel with a tour company

that takes travelers used to high thread counts and high tea on treks through the Himalayas.

The tour, led by "an Eton-educated art history expert with a preference for ascots and roll-your-own cigarettes," takes its pampered guests to specially renovated rooms in local homes.

There were chef-prepared dinners, hot showers, and heavenly beds made with Shaki's own linens. Handmade soaps stood beside the copper bathroom basins.

Well, ain't that just ducky?

The article marvels that "places like Ladakh can still exist." How much longer does the writer believe that "places like Ladakh" will continue to exist, once truckloads of "handmade soaps" and "Shaki's own linens" begin arriving from the lowlands?

I trekked in Ladakh for a bit under three weeks in 2005. As the article says, the region is remote and, by Western standards, primitive. The landscapes are stunning. The people are poor -- again, by Western standards -- but spiritually rich. The region's capital, Leh, was already, in 2005, a trekking center with small, inexpensive hotels, pizza joints and cybercafés. Even we trekkers, by our very presence, were helping to change the lives and attitudes of the peoples among whom we trekked -- we perhaps blended in by our simple lives and meals, but not by our modern clothes and camping gear. But we at least trekked with a company that was dedicated both to minimizing our cultural impact and to spending our money on goods and services produced by local residents.

Travel in itself is homogenizing -- it changes both the travelers and the people they visit. Travel's impact is entirely favorable on the traveler; it can often be favorable on those visited. But the greater the disparity in appearances, including apparent wealth, between the two groups, the earlier the place visited will lose the unique qualities that made it worth visiting.

I understand that there is a selfish emotion involved here -- a desire to keep an area impoverished and primitive for the entertainment of the Western visitor. It's easy to call a people poor in assets but rich in spirituality, and to use that as an excuse for perpetuating poverty and hardship. But, on the other hand, comparison of societies before and after "Westernization" and influx of luxury tourist dollars leaves me unconvinced that the changes brought by tourism represent unalloyed improvement.

But Westernization will continue, everywhere, and will result in a more homogenized world. As tourists, however, we have some responsibility to ensure that our travel does not encourage the more undesirable aspects of this change, and, in so far as possible, encourages actual improvement in the lives of the people we visit.

I don't think that "bringing luxury" to permit more light-hearted travel for visitors to the Himalayas is the answer. Nor is it the answer, in the long run, to the interests of Western tourism. I'm reminded of the struggle to create the North Cascades National Park in my own state. Critics complained that park advocates wanted to "lock up" the wilderness, making it available only to "hardy hikers and mountaineers." (These phrases were repeated ad nauseam.) They wanted roads throughout the park, and trams to the more interesting summits. They seemed determined to overlook the fact that the attraction of the North Cascades consisted not merely in the scenery -- more easily viewed in picture books and movies -- but in the fact that the North Cascades was one of the few remaining true wilderness areas in America where humans could experience something of the solitude experienced by the pioneers, where a hiker could find himself forced to rely on his own preparations, skills and stamina to complete his hike.

I'm sorry to read of "luxury tours" to the Himalayas, just as I'm sorry to read of helicopter trips to Everest Base Camp and road building throughout the Andes. These changes may seem to promise, in the short run, greater opportunities to tourists and more money to local people; in the long run, they degrade the travel experience of those doing the visiting, and the lives and cultures of the persons being visited.

Wednesday, January 19, 2011

Medieval antecedents


The child is father to the man, so the saying goes. In the same way, the medieval world gave birth to the modern world of today. To understand why we act the way we do, both as individuals and as nations, we often need to look back to our childhoods.

The University of Washington Alumni Association's annual history lecture series this year is entitled "Medieval Origins of the Modern Western World," delivered by Prof. Robert Stacey. As a one-time medieval history major myself, I showed up for the sold-out series last night expecting a rather superficial summation of the more exciting events of the period, a number of anecdotes that might appeal to the average guy who's been out of school for a while.

I was pleasantly surprised.

The series contains just four lectures. I regret having missed the first one, entitled "The Oddity of the Modern West," while I was in California. This week's lecture discussed the origins of one such "oddity": "The Separation of Religion from Politics." Dr. Stacey's lecture was one of the best I've heard in the years I've attended these lectures at the UW. It was well delivered, highly organized, and crammed with information. The lecture managed to cover 1,700 years of history without our ever losing sight of the primary point that Dr. Stacey intended to make: that events and conflicts during the middle ages led ultimately to a sharp break with the past in the way Western man viewed the state -- how he understood the basis for the legitimacy of government and the objectives for which he believed government to exist. This break was centuries in development, but culminated in the aftermath of the religious wars of the 17th century.

In just under two hours of lecture time, Dr. Stacey discussed the fusion of worship and secular life in the Roman household and imperial government; St. Augustine's City of God, and the misunderstanding of his concept of the "two cities" by subsequent religious and political thinkers; the virtual fusion of priestly and secular functions in the Carolingian monarchy; the Investiture Controversy between the papacy and lay rulers of the 11th and 12th centuries, leading finally to a line drawn between lay and religious government; the Reformation conflicts and the theocracy of Calvinist Geneva; the Edict of Nantes; the Thirty Years War in the 17th century, emphasizing the religious rather than secular conflicts played out in that war; and the ultimate evolution of the state, in the wake of the moral exhaustion bred by the religious wars, into an entity that commanded allegiance and offered its benefits apart from the religious beliefs of either its ruler or its citizens.

As an undergraduate, I took a very good course in political theory, a course that covered these same topics; Dr. Stacey's single lecture pretty much summed up all the understanding (and more) that I took away from that undergrad class after the finals were over.

So props to the Alumni Association and to Dr. Stacey. I look forward eagerly to the two remaining lectures, "Limited Government" and "Love and Marriage."

Tuesday, January 18, 2011

Sitting up straight


My long-suffering piano teacher showed me a book this morning, a book written for grade school children. The book shows beginning pianists how to relax their arms while playing, so that they use their fingers, not their shoulders, to exert pressure on the keys. "One thing you never learned as a child, apparently," she sighed. "One of many things," I agreed.

I began thinking of the many things I never learned as a child. To what extent are our adult talents and abilities shaped and limited by what we learned or failed to learn when we were kids? Is it, ultimately, all Mom's fault?

If you've been awake this past week, you know about the Chinese-American author Amy Chua and her book, Battle Hymn of the Tiger Mother. She describes the draconian tactics to which she resorted in pushing her daughters to be the brilliant adults that they clearly are today. And competent musicians. One daughter performed at Carnegie Hall, but only after a childhood of forced four-hour practice sessions at the piano, and maternal threats to burn her stuffed animals if she failed to play a composition perfectly. (No, I haven't read the book, nor do I intend to read it. Life's too short. I'm relying on a story in the New York Times.)

Thoughtful readers of my posts may assume that Amy has my whole-hearted approval, but she doesn't. She has only my half-hearted approval. Her parenting techniques, as she describes them, fall at one barely acceptable extreme of the parenting continuum; the parent who sits before the TV every night and considers his or her child's education to be the function of the local school district would be at the other end. Somewhere in between lies the ideal approach, and I suspect that ideal balance varies from child to child. Unfortunately, the optimum balance is easier to locate in retrospect, after the kid's already grown up, than it was at the time the knowledge was needed.

The reader comments to the NY Times article, and to a follow-up column in the Times, vary widely in how they view Ms. Chua -- from admiration to suspicion of child abuse. The comments are all interesting and thoughtful, and the experiences each reader has had with his/her own children show the wide variety of child rearing approaches that can lead to the development of successful adult lives. Or, for that matter, to failures.

My own parents, like most in the milieu in which I grew up, took a rather laissez faire approach to bringing up their kids. More so than average, perhaps, even for our place and time. They wanted good grades from us, and college educations, but they pretty much left it up to us to figure out how best to achieve those results. That approach gave me an enormous amount of room as a kid for exploration and experimentation, for following my own sense of curiosity. It also gave me a lot of room to develop rather lazy habits of thought and poor self-discipline.

If I had my childhood to live over again, I'd hope for a bit more pressure from my parents, a setting of the bar of acceptable achievement a bit higher, rather than their relying entirely on my own sense of self-pride and ambition. On the other hand, I have to admit I'm glad they didn't refuse me access to the bathroom until I'd finished perfecting, to their satisfaction, the piano piece I was playing. As Ms. Chua cheerfully admits to doing.

I've known plenty of people whose parents tended toward either extreme. In general, those with the stricter parents have developed certain enviable talents -- like playing an instrument well -- more frequently than the others. But I can't really say that either approach by the parents has led consistently to adult lives that were either happier or more productive. Different kids respond differently to different parenting approaches. Parenting remains an art rather than a science.

This week's debate among readers of the Times articles is a microcosm of a national debate that will grow increasingly sharp in the years just ahead -- between those who feel that today's form of American education best develops the creativity and social skills that are needed for both personal and national success, and those who feel that the Chinese/Indian/Jewish model of hard work and, at times, rote learning in childhood teaches the self-discipline and analytical skills that Americans are increasingly failing to develop, especially compared to kids in the developing Asian nations that will be our competitors in the next decades.

As expected, I think both sides are right, and that what we need is a proper blend of the two approaches. But then, my own lackadaisical education taught me to be just that way -- non-judgmental to the point of sounding wishy-washy.

And the music still doesn't flow to my finger tips -- through relaxed arms -- as I strike the keys on the piano!

Sunday, January 16, 2011

Beyond the horizon


What do we really know about the universe in which we live? More importantly, perhaps, what can we know?

Brian Greene, a Columbia University physics professor, discusses today, in a New York Times feature article, the consequences of the fact that the universe is expanding at an accelerating rate -- not, as one would expect, at a slower rate. This accelerating expansion suggests that some force -- in addition to the original force of the Big Bang -- is more than counteracting the mutual gravitational attraction of all the matter in the universe. Dr. Greene describes this force as Einstein's posited "repulsive gravitation" or, in modern terms, "dark energy" -- something that is inherent in the nature of space itself.

Regardless of the mysterious nature of dark energy, its observed effect is that the universe is expanding, and expanding faster and faster. Eventually, this expansion will result in remote galaxies distancing themselves from each other at a speed faster than light.[1] At that point, such galaxies will become permanently invisible to each other. Perhaps 100 billion years from now, our descendants on earth, gazing at the sky through telescopes, will see nothing but the stars in our own Milky Way galaxy.
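The arithmetic behind that claim is simple Hubble's-law reasoning: recession speed grows in proportion to distance (v = H0 × d), so beyond some distance -- the so-called Hubble radius -- the formal recession speed exceeds the speed of light. A minimal back-of-the-envelope sketch, using an assumed round value for the Hubble constant (about 70 km/s per megaparsec; my assumption, not a figure from Dr. Greene's article):

```python
# Back-of-the-envelope Hubble radius: the distance at which Hubble's law
# (v = H0 * d) gives a recession speed equal to the speed of light.
# H0 is an assumed round value (~70 km/s/Mpc), not taken from the article.
C_KM_S = 299_792.458    # speed of light in km/s (exact, by definition)
H0 = 70.0               # assumed Hubble constant, km/s per megaparsec
MPC_TO_MLY = 3.2616     # one megaparsec in millions of light-years

hubble_radius_mpc = C_KM_S / H0                              # roughly 4,300 Mpc
hubble_radius_gly = hubble_radius_mpc * MPC_TO_MLY / 1000.0  # in billions of ly

print(f"Hubble radius: about {hubble_radius_gly:.1f} billion light-years")
```

A distance on the order of fourteen billion light-years, that is -- beyond which today's light will never arrive. No speed limit is broken: as the footnote to this post explains, the galaxies aren't moving through space faster than light; space itself is expanding.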

Dr. Greene notes, somewhat whimsically, that such future astronomers, if confronted with ancient documents describing galaxies as numerous as the stars themselves in our own galaxy, will have to decide whether to believe their own observations or the accounts from those documentary relics from the past. Accounts of a sky full of galaxies will seem as dubious to them as stories of the Greek or Hindu gods do to us today.

The moral is clear: we can never learn everything, or even very much, about the nature of reality. What critical information is already unknowable to us today because of analogous shielding of past events? Dr. Greene notes:

We've grown accustomed to the idea that with sufficient hard work and dedication, there's no barrier to how fully we can both grasp reality and confirm our own understanding. ... [But] sometimes the true nature of reality beckons from just beyond the horizon.

When considering how we "know" our world on the human scale, I'm always amazed at how our bodies perceive a narrow band of electromagnetic radiation ("light"), some neurological reactions to pressure on the skin ("feel"), and waves of compression in the atmosphere ("sound") -- and how our brains build a mental image of what lies around us out of this limited data, and how we then confidently believe that we have "observed" reality. How different would the "reality" of life on earth look if we perceived only ultraviolet light? Or x-rays?

And we are here as on a darkling plain
Swept with confused alarms of struggle and flight,
Where ignorant armies clash by night.
--Matthew Arnold

Dr. Greene's article makes it clear that we not only are limited in our perception of reality by the limitations of our sensory perceptions and our brain's ability to process all those perceptions that it does receive -- we are limited by the very nature of the universe itself.

We should feel a certain humility when we assert claims of absolute certainty about anything. But of course we don't. And won't.

------------------------------
[1] As Dr. Greene explains it, this expectation does not contradict relativistic limitations. Relativity applies to the motions of objects, relative to each other, within space. "Dark energy" is not causing objects to fly apart within space; it is causing space itself to rapidly expand.

Tuesday, January 11, 2011

Fanaticism


Today's New York Times carries an article discussing the growth of Islamic fanaticism in Pakistan. A Pakistani political analyst notes that the country has been drifting for some time into religious extremism.

This conservatism is fueled by an element of class divide, between the more secular and wealthy upper classes and the more religious middle and lower classes.
--New York Times, quoting Najam Sethi, a former Pakistani newspaper editor

The article discusses the recent assassination of the Punjab's governor by one of his own bodyguards, and the fact that younger and more religious members of the legal profession now consider the killer a hero. Lawyers showered the defendant with rose petals when he appeared in court to face charges. The growing religious fervor is reinforced by a strong strain of populism.

"Salman had an easygoing, witty, irreverent, high-life style," he [Sethi] said, "so the anger of class inequality mixed with religious passion gives a heady, dangerous brew."

Moderate Pakistanis blame the religious parties and fundamentalist clerics for inciting such attacks. They also blame the press: "Democracy has brought us a media that is extremely right-wing, conservative," according to Sethi.

While Pakistan is a far different country from the United States -- a country actually on the verge of becoming a failed state -- certain dynamics in both countries are uncomfortably similar. Here we have fundamentalist Christianity rather than fundamentalist Islam. This similarity should not be exaggerated; most Christian fundamentalists abhor violence and focus on individual rather than national salvation. We have certainly seen an uglier and more politicized side to that fundamentalism among some national religious figures, however.

This fundamentalism is reinforced by a growing populism -- not just in the benign sense of sticking up for the common man, but in an uglier sense of detesting all forms of authority -- government, business, unions, universities, academic experts, science, experts of any kind. This is a throwback to the populism of Andrew Jackson, but is a dangerous development today, appearing in a highly technological, non-frontier society.

We also have Fox News and certain of its commentators, media figures who have discovered that inciting hatred brings in far more readers and money than does dispassionate discussion of political issues. To Fox, both religious fundamentalism and anti-intellectual populism are trends to be cultivated, whether to advance its owners' actual political beliefs or simply to maximize profits. Beyond Fox, even, we have a plethora of blogs and message boards that are truly scary to read.

The assassinations in Tucson this week may well be the result of the killer's serious mental illness. Jared Loughner's political views, such as they are, don't really match those of any one extremist party or cult. But the hatred filling the media over the past decade -- the loathing of "liberals," the belief that the Constitution has been hijacked, the hatred of government in general, the bewildering (at least, to a liberal) fury at health reform legislation -- may well have given focus to Loughner's paranoia and emotional turmoil, leading him to target a Democratic member of Congress serving in a state not particularly receptive to Democratic party principles.

Regardless of any connection or lack of connection between Loughner and the radical right, the shooting should at least call our attention to dangerous currents within our society, and a growing decrease in our ability as concerned citizens to reason out political solutions using logic rather than falling back on abstract dogma, name-calling, and mutual loathing.

We still have a long way to go before we reach the plight in which Pakistan now finds itself. But let's not go any further.

Friday, January 7, 2011

Frontier talk


A fair-to-middlin' American writer named Ernest Hemingway once observed, famously, that "all modern American literature comes from one book by Mark Twain called Huckleberry Finn. …All American writing comes from that. There was nothing before. There has been nothing as good since."

Good old Hem's opinion hasn't prevented repeated attempts to ban the novel from American high schools. The reasons advanced have varied over the years. In earlier times, the problem was primarily with Huck's character and with the plot. Huck, the hero, is rebellious. He repeatedly gets the "nice boy," Tom Sawyer, into trouble -- and does so with the author's apparent full approval and enjoyment. He runs away from home and floats down the Mississippi with a runaway adult slave for his friend. (A prominent literary critic argued in the 1950's that Twain was suggesting an interracial, pedophile relationship!) He encounters a number of American frontier types, none of whom is in any way admirable or edifying.

At the end of the book, after returning to Aunt Sally and the world of respectability, Huck confides to his readers, in one of the more memorable final paragraphs of any novel, that he plans to run off again:

But I reckon I got to light out for the Territory ahead of the rest, because Aunt Sally she's going to adopt me and sivilize me, and I can't stand it. I been there before.

Not the sort of lesson that many adults wanted impressionable young students to learn.

Later, the concern seemed more with the texture of the novel. Mark Twain excelled in casting dialogue in the dialect appropriate for each of his fictional characters. Huck's language is full of grammatical solecisms. "Ain't" is used regularly and consistently. Jim, the slave, speaks in the black dialect of the time. Even the earliest critics of the novel complained of the "coarseness" of the language. It was, well, too realistic.

Many of these earlier complaints have continued up to the present, but the issue is now primarily "racism." Use of nineteenth century black dialect by black characters, such as Jim, connotes ignorance, it is claimed. Particularly when the novel is used in the classroom, it hurts the feelings of African-American students and suggests inherent black inferiority to the others. Why not have all the characters speak standard English, even if none of them would have done so in real life? Why, for that matter, I suppose, should Charles Dickens have differentiated between various classes and regions of England by writing each character's dialogue in an appropriate dialect?

All these arguments tend to be summed up in the horror often expressed with respect to the frequent use (219 times, according to someone's count) of the word "nigger" in Huckleberry Finn. (Note: I actually said "nigger" (there, I did it again!), unlike almost all recent journalistic reports on the subject. I didn't say "n-----" or "the n-word" or any other euphemism. I'm daring that way. I even italicized it for you.)

"Nigger," of course, began as a sloppy pronunciation of "Negro," which in turn is the Spanish word for "Black." Like most offensive words, there's nothing offensive in the word itself, but in the use that has historically been made of it. Dictionaries advise that the word was used in a derogatory fashion from the very beginning, but has become markedly more offensive in recent years. Fashions come and go in many words; once offensive words become accepted and once accepted words become offensive.

In pre-Civil War days, when the novel takes place, the word was widely used, even in "polite" society. It was derogatory even then, but it was -- after all -- applied to persons who were treated by the law as the chattel property of their owners. African Americans -- being mostly slaves -- were not highly esteemed in law, by society, or by the great majority of the population. As a result, there was not much concern over the nicety of the terms used to describe them. I suspect that "nigger" in those days was considered considerably less offensive than we would consider "spic" or "kike" to be today.

In any event, the word was in daily use.

My present concern stems from the well-publicized recent publication of an expurgated version of Huckleberry Finn, where the expurgator (Alan Gribben, an Auburn professor) has converted every use of the word "nigger" into the word "slave," in an apparent belief that it's better to be called a slave than a nigger. The book is intended primarily for use in the schools, where young ears and young ears' parents might be offended.

Should we give in and go along with it? Don't be silly. Huckleberry Finn is considered a classic for many reasons, all of which remain valid right up to the year 2011. You don't tidy up classics to meet changing tastes, any more than sane people put plaster fig leaves on nude Renaissance statues. And modifying 219 appearances of one word won't satisfy the novel's vocal critics.

Today's issue of the Seattle Times contains a letter from a reader, responding to a news story about Gribben's "masterpiece." The letter concludes:

Our shared humanity is now beyond debate, and we have outgrown Huck and Jim. There are more modern books that accept the premise of "Huckleberry Finn" as given, and address the more subtle questions of race that still exist. Those books should be read and discussed in schools, and "Huckleberry Finn" should be honorably retired.

This well-written and non-inflammatory letter seems to miss the entire point of teaching English literature. Mark Twain's novel is a story of the American frontier and of the conflicts confronted by an intelligent and perceptive, but uneducated, adolescent. It's not a sociological tract. Would the writer suggest that we now have a better understanding of religious and sexual values, and that therefore Hawthorne's rather disturbing novel, The Scarlet Letter, should be retired from the curriculum in favor of a book about a modern pastor who provides wise counseling to unmarried couples who are expecting a child?

Teachers, teach the classics; don't tamper with the text; and do discuss the books with your students. The kids aren't oblivious, but if there's any doubt in your mind as a teacher, go ahead and explain that "nigger" is not acceptable language today, and that it was insulting even in Twain's day. Also explain that "ain't" is dialect, not standard English, and that "sivilize" is just plain bad spelling. Get all that out of the way. Then talk about the lessons Twain was teaching in the novel. Tell the students how times, values and conventions were different from those they are familiar with today. And make sure they discover for themselves how -- all didactic intent aside -- the book is just an exciting, funny and entertaining read.

Oh -- and by the way -- Gribben also converts every use of the word "injun" to "Indian." (I guess Injun Joe now becomes Indian Joe.) Don't even get me started -- at least he doesn't call them "cardinals."

-------------------------
(1-9-11) Leonard Pitts, a syndicated, liberal, African-American columnist, in a well-reasoned column opposing Gribben's "modifications" to Huckleberry Finn that appears in today's Seattle Times, concludes:

Huck Finn is a funny, subversive story about a runaway white boy who comes to locate the humanity in a runaway black man and, in the process, vindicates his own. It has always, until now, been regarded as a timeless tale.

But that was before America became an intellectual backwater that would deem it necessary to censor its most celebrated author.

The one consolation is that somewhere, Mark Twain is laughing his head off.

Thursday, January 6, 2011

One more year


"I am committed to earning my degree in architectural design from Stanford University and am on track to accomplish this at the completion of the spring quarter of 2012."
--Andrew Luck

Thus did the consensus first pick for the 2011 NFL draft give up an immediate guarantee estimated at up to $60 million today, choosing instead to stay in college, get his degree, and play one more year of college football.

The message boards have been blazing all afternoon. A few commentators congratulated Luck on his decision. Many found his decision incomprehensible -- "you go to college to gain a skill so you can make money, and you're risking a fortune. If you really love architecture, why not grab the money and go back to school later?" "Doesn't Stanford teach you basic economics?"

And a large number of even more rabid commentators foamed at the mouth over a matter that had virtually no impact on their own lives. They appeared outraged by a decision that they sensed questioned their most hallowed belief -- that making a pile of bucks is the essence of life itself. They attacked Luck's intelligence, his father's wisdom, his coach's advice. One message board writer even said he hoped Luck would sustain an injury next year, to prove the foolishness of today's decision.

Luck had promised to carefully consider all the pros and cons of his choice. The money involved, the uncertainty over the future of the NFL's compensation system, and his probable draft by the Carolina Panthers -- each undoubtedly played a part in his decision-making. But his father, himself a university athletic director and former NFL quarterback, undoubtedly had the best insight into Andrew Luck's final decision:

Luck’s father, Oliver, said his son wanted to complete his degree in architectural design, a rigorous major in the college of engineering. Luck also felt, his father said, the tug of finishing his career with the players whom he entered school with.

“He wants to finish with those guys,” Oliver Luck said in a phone interview. “It’s a great group of players. That was by far the most important factor.”

Oliver Luck was listening to radio hosts criticize the decision on the radio Thursday and recalled the psychological test in which people perceive different things in inkblots.

“It’s a Rorschach test for people’s values system,” he said of the decision.
--New York Times (Pete Thamel)

No one would have criticized Luck if he had turned pro. But his critics are evaluating his decision as a business decision by a business. Instead, it was a life decision by a young man setting out in life. The choice he finally made reflects credit on himself, on his family, on his team and teammates, and on his school.