Creationism and Its Critics in Antiquity

Creationism and Its Critics in Antiquity by David Sedley. University of California Press, 2009, paper.

 

The world seems to be configured in ways hospitable to life forms. Is this the outcome of divine planning or an accident? Creationism, the argument that the world's structure and contents can be adequately explained only by postulating an intelligent designer or creator god, was favored by the classical world. Greek and Roman creationists differed amongst themselves over the mechanisms that the creator used, or at least had at his disposal. Critics of these ancient creationists argued that the origins of life could be explained as the outcome of chance occurrences or accidents, made possible by the infinity of space and matter.

 

David Sedley cautions the reader that this debate is unlike the present-day dispute between creationists and scientists. Fundamentalists base their beliefs about the origins of the world on the several creation stories in ancient Hebrew texts; scientists base theirs on deductions from accumulated observations of nature. Neither view was held by the ancient creationists or by their critics.

 

Sedley doesn't discuss the creation stories told by the ancient Israelites. They differed from the Greek stories in numerous ways. For example, the Israelite tradition has Yahweh creating the world out of nothing. The Greek metaphor for the creator god was that of the craftsman, or sculptor, fashioning the world out of existing space and matter.

 

Classical creationists most often held an anthropocentric view of creation. A benevolent god had created plants and "lower" animals to serve the needs of humans, an idea not unlike the Hebrew tradition of Yahweh giving humans "dominion" over His creation. Plato, for example, talks about the creator's goodness. Their critics asked: How, then, does one explain aspects of this created world that were not beneficial to mankind? Man-eating beasts, volcanoes, earthquakes?

 

The origins of natural phenomena were not the focus of creationist speculation. Rather, they thought in terms of ends or goals, intentions or purposes.

 

There were, Plato believed, several episodes of creation. First a divine creator, or demiurge, fashioned the material world. That materiality was subsequently infused with a world soul. Did Plato actually believe in the mythological artifice he invoked? Sedley compares Plato's use of myth to John Locke's social contract: social and moral relations are fundamentally contractual, yet Locke did not believe that there was an actual written or verbal contract in the historical past. Might the same be true of Plato's creationism?

 

Classical notions of creation were embedded in a rich philosophical tradition. For example, the Greeks were fond of the idea of symmetry, which raised the question: could the cosmos have had a beginning but no end? Restoring that symmetry meant hypothesizing either that there was no beginning, no time when there wasn't space or matter, or that there would be no end, which for the ancients was also a problem.

 

The Greeks and Romans never came up with the idea of a creator god who then left his creation to run by itself. That is a much younger metaphor: a divine craftsman who fashioned a clock-like world, wound its spring, and subsequently had no power, or chose not, to interfere in the working out of his creation.

 

The classical world also never came up with the idea that one species evolved out of another, a key notion in evolutionary biology in the nineteenth century and since. However, some Epicureans came close to the idea of adaptation through natural selection that would involve extinctions and survivals.

 

Creationists had difficulty explaining why the rich fossil record of the Mediterranean world revealed creatures no longer present in their own age. Why would a divine creator have created anything so imperfect that it perished? Or were these creatures still to be found in some corner of the earth?

 

David Sedley has woven together various ideas and movements from Greek cosmology and natural history that demonstrate a diversity of views over several centuries. It was a genteel discourse. Not as much was at stake as would be the case in the Christian west in a later age.

Flags of Our Fathers

Flags of Our Fathers by James Bradley with Ron Powers. Bantam Books, 2006, paper.

 

Flags of Our Fathers is a son's commemoration of a veteran of the Pacific War. Published in 2000, it was the basis for a film released in 2006. The elder Bradley had died a few years before the book's publication. In clearing out a closet, his family discovered boxes of letters and photographs that told the story of his war experiences.

 

During World War II, John Bradley served in the Navy Medical Corps attached to a Marine company. He was part of the first wave of an amphibious force of 80,000 men that landed on the Japanese island of Iwo Jima in February 1945; this year marked the landing's 70th anniversary.

 

Capturing the island was important to our strategy of defeating Japan. Japanese planes based on Iwo were menacing our bombers on their way to Japanese cities. The island had 22,000 fanatical Japanese defenders who would fight to their deaths. Mostly young recruits, they had constructed a network of pillboxes and underground bunkers that made them almost impossible to see, and kill.

 

On the fourth day of the battle, a company commander sent a platoon to secure the top of Mount Suribachi, the highest point on the island. A flag was taken to the top and unfurled so that the Americans fighting below could have some assurance that victory was theirs, even though they had a month of deadly combat still ahead.

 

The company commander, thinking that the flag should become a battle souvenir, sent up a replacement flag with four marines who were laying a telephone line to an observation post at the top of Suribachi. They were joined by two more marines in hoisting the replacement flag, one of them the author’s father.

 

This second flag raising was caught by an Associated Press photographer, Joe Rosenthal. His image was transmitted by wire to the U.S. Two days later the photograph of the American flag being raised over Iwo Jima appeared on the front page of most Sunday morning newspapers. It became an instant success and then an American icon.

 

Three of the men who had raised the flag died before the battle was over. John Bradley was wounded and in a military hospital when those in the photograph were identified. He and the two surviving marines were flown back to the States to be the focus of a war bond drive in the spring of 1945.

Bradley found this celebrity status a mockery of both the brutality and the heroism of battle. His two comrades also found the abrupt transition from battlefield to instant celebrity shattering; they had left their comrades behind to finish the battle. One of them, Ira Hayes, a Native American, never recovered. He took to the bottle and died a tragic death in 1955.

 

You are told enough about these six flag raisers to understand much about their lives and families: their common experience as kids growing up in Depression America; their training, which had transformed them collectively into efficient killing machines; the horror of what they saw and experienced; the death of many of their companions; the difficult return to civilian life after the shock of battle; the nightmares and weeping. To cope with those phantoms, John Bradley found it necessary to put out of his mind the most vivid experience of his life.

 

James Bradley found out from a third-grade teacher that his father was in the Iwo Jima photograph. When he asked his dad about being a hero, the elder Bradley insisted that the picture had nothing to do with heroism and that the real heroes were the men who never returned. The subject was dropped and never again raised in his presence.

 

Flags of Our Fathers received much praise from readers and reviewers; few books have captured the complexity and fury of war and its aftermath as well. It reminds us that the memory of the Pacific war will soon no longer be in the possession of the living. The 'greatest generation' will not always be around to tell their story.

 

Over Here; How the G.I. Bill Transformed the American Dream

Over Here; How the G.I. Bill Transformed the American Dream by Edward Humes.  Harcourt, 2006.

 

Edward Humes puts the Servicemen's Readjustment Act (1944) in the same class of "transformative legislative achievements" as the Morrill Land-Grant College Act (1862) and the Civil Rights Act (1964). Commonly known as the G.I. Bill, it granted benefits to sixteen million returning servicemen and women: hospitalization, unemployment benefits, free higher education and technical training, and low-interest home loans with no down payment. Humes contends, though, that few in 1944 understood its transforming character.

 

The G.I. Bill began as part of a third wave of Roosevelt's New Deal legislation. Veterans' organizations, including the American Legion, supported benefits. Most Americans wanted to show their gratitude to those who had fought the good war. And Congress was looking for a means of ensuring that returning servicemen, who would soon be voters, readjusted to civilian life.

 

Servicemen returning from World War I had not been treated so well. After much agitation by veterans, the Coolidge Administration finally agreed to a bonus, but in the form of government bonds worth $7,000 (in 2006 dollars) and not redeemable until 1944. A protest by veterans in Washington in 1932 (the Bonus Army) had been broken up by the army, under orders from President Herbert Hoover and led by General Douglas MacArthur. It was a public relations disaster for the Hoover Administration, and everyone wanted to avoid a repeat of that spectacle.

Despite Roosevelt’s intentions, the G.I. Bill was not colorblind.

Representative John Rankin of Mississippi, head of the House committee on veterans' legislation, understood that benefits given to returning black soldiers would transform the South's racial politics. Initially hoping to block the bill, Rankin managed instead to insert a provision that turned the awarding of benefits over to local and state officials.

 

In this and other ways the nation’s generosity did not extend equally to black Americans. Nor to the millions of defense workers, many of them women, who had also interrupted their lives to fight the good war.

 

Most universities and colleges welcomed returning veterans to their classrooms, emptied in the '40s by the military draft. Humes points out, though, that some patronizing university educators were "concerned" about how returning veterans would fare in their elitist institutions. Many would be the first in their families to enter college; some were high school dropouts.

 

It turns out that G.I.s were 'overachievers.' They were serious about wanting a college education and then getting on with their lives. In 1956, six years after most G.I.s had graduated and gone off to jobs and families, Iowa State University faculty were still commenting on what good students they had been.

 

The G.I. Bill also altered the landscape of urban America. The no-down-payment, low-interest mortgages created a huge demand for modest "starter homes." Since little housing had been built during the Depression and the war, the Bill was welcomed by the housing industry. There was, however, little in the way of urban planning, and most of the new housing was mass-produced, built in the suburbs, and largely for white veterans.

 

Housing was a lead sector with a huge linkage effect on the furniture, appliance, and houseware industries. Detroit loved the new drive to the suburbs. In other words, housing was vital to the American economy's survival once war production ended.

 

The G.I. Bill created a class of capitalists who owned a house and hence an asset. They became part of the 'ownership society.' Perhaps ownership was one reason veterans tended to become more conservative as the years passed. Humes notes that World War II veterans often opposed similar benefits for the veterans of the Vietnam War. The "greatest generation," having benefited from big-government programs, ultimately opposed big government.

 

 

American Chestnut

American Chestnut; The Life, Death, and Rebirth of a Perfect Tree by Susan Freinkel. University of California Press, 2009, paper.

Back in 1904 an urban forester noticed that American chestnuts (Castanea dentata) in the New York Zoological Park, now the Bronx Zoo, were dying of a blight caused by a fungus. The fungus would soon spread to other chestnut trees in the city, then to the forests of the eastern United States and Appalachia. By the end of the sad story that Susan Freinkel tells, billions of trees were dead and the American chestnut was all but wiped out everywhere in its range.

 

The American chestnut had been an important species in its ecosystem, dominating the upper canopy. Never the preferred wood for the furniture and building industries, it was ‘second best’ but a good source of tannin. Its ample nut harvest was one of fall’s pleasures; we sang about “chestnuts roasting on an open fire.”

 

Foresters were divided about how to respond to the blight. Some believed that it was hopeless to fight the fungus. Might as well harvest the trees, dead and alive. There was one brief effort to stop the blight, financed by the state of Pennsylvania, to no effect.

 

The blight fungus kills the trunks and branches of the tree; it doesn’t destroy the tree’s resilient root system. New growth sprouting from its roots will continue for years, only to be killed off by the fungus before the young trees have matured enough to reproduce. Eventually the root system wears out and the tree dies.

 

The U.S. Forest Service’s advice to take down even healthy specimens complicated, Freinkel contends, efforts to find disease-resistant trees. Individual trees with a slightly different genetic makeup might have survived and propagated. Occasionally even groves of chestnuts did survive, mostly outside the normal range of the tree.

 

There were two approaches to fighting the demise of the species: fix the tree so that it couldn't be hurt by the fungus, or fix the fungus so that it couldn't damage the tree. The European chestnuts were saved by the second strategy: a virus was found that sapped the fungus's virulence. That approach was also tried here with little success. Because there are different strains of the fungus, and because the fungus can develop immunity to any virus, breeders had to contend with an ever-changing target.

 

The more successful strategy in this country has been to cross the species with the blight-resistant Chinese chestnut. But Chinese chestnuts are short understory trees, not capable of dominating the upper story of American forests. Worried about its form, American breeders backcrossed the Chinese/American hybrid with a surviving C. dentata and repeated the process until they had a tree that retained the resistance of the Chinese chestnut but had the form of the American tree.

 

Freinkel reminds us, however, that there will be problems when reintroducing the ‘tame’ chestnut into the ‘wild.’ Our forests have had one hundred years to adjust to the demise of the chestnut. Oaks and poplars now dominate their upper canopy.

 

Freinkel also discusses an option that biotechnologists have proposed: altering the genetic structure of the tree by inserting a gene into chestnut embryos that would increase its resistance enough for it to survive. That scares a lot of folks, however, just as genetically modified food crops do. A genetically modified tree would essentially be a new species, posing the problems of invasiveness and genetic drift. Biogeneticists assure us that Mother Nature usually takes care of the first problem, and they deny the second.

 

Then there is the issue of the 'wild' to which eastern forests should be returned. A fire-tolerant species, the American chestnut likely owed its dominance to Native Americans' use of fire to maintain their woods.

 

The book mourns the loss of this wonderful tree. It is also an absorbing account of our continuing interaction with a much damaged forest environment.

 

The Economy of Prestige

The Economy of Prestige; Prizes, Awards, and the Circulation of Cultural Value by James English. Harvard University Press, 2008, paper.

This is a fascinating look at prizes, awards, and festivals in the arts and letters. James English discusses mostly prizes in literature, music, and film. He argues that the awards 'economy' is a system of non-monetary, symbolic transactions, enabling artists and authors and the institutions that award prizes to engage in a collective project of value production.

English mostly defends the system. Prizes such as the Oscar, the Nobel, the Pulitzer, the National Book Awards, and the Man Booker uphold a long tradition of valuing and honoring creative individuals. But it is true that the award ceremonies are often farce, circus, embarrassment: all those fancy gowns worn by starlets hoping to be seen, the crass commercialism of the ceremony.

There are precedents from ancient Greece and eighteenth-century Europe, but the establishment of the Nobel Prize in Literature in 1901 marks the beginning of the contemporary economy of prestige. It remains the benchmark of all cultural prizes. Alfred Nobel's prize was duplicated by the Prix Goncourt in France in 1903 and the Pulitzer Prize a few years later in the U.S. And so on. And on. Prizes, awards, festivals, biennales, halls of fame, etc. are now literally too numerous to count.

Yet this field of cultural prizes is not overgrown, English argues. In fact, opportunities or gaps for new prizes are continually being opened. For example, various minorities or new cultural movements have established prizes because they did not feel represented by the existing field of cultural awards.

English contends that this prize economy has been a mixed blessing to the arts and letters in the emerging, post-colonial countries. Generally their films and novels have received international prizes as a result of their popularity in the old imperial metropolises. But are these prize-winners the best representatives of the arts of the former colonies? The Nobel Prize given to Wole Soyinka in 1986 was controversial in his native Nigeria. His Euro-modernism was only one of several directions in which Nigerian literature was moving. True, Africa had long been neglected by the Swedish Academy. But why Soyinka, other than his popularity outside of Nigeria?

The administrative costs of most cultural prizes far exceed the cash prize itself. Administrators have an immense job: they must select a final few from the many, many nominations. English estimates that 98% of the selection process is done before the publicly acknowledged judges and juries commence their work. Professional readers, sometimes even the office staff, "prejudge," that is, decide who makes the final cut. Among other criteria, they must consider whether potential recipients are well enough known to give the prize credibility.

The economic capital generated by the awards depends upon the prestige of the judges as well. They have to be well known and respected in their field.

Occasionally a recipient decides that it is beneath his or her cultural status, or against his or her principles, to accept a prize. English talks about a strategy of condescension that allows those who reject prizes nevertheless "to remain in play." Julie Andrews, who refused a Tony Award in 1996, is said to have staged that refusal so as to publicize her then-current Broadway show. Perhaps the judges were not worthy of judging her talent.

Andrews can afford to refuse a prize or two. She has many under her belt. Michael Jackson had won no fewer than 240 awards before he died.

Ironies abound. Most commonly the prize winners are already well known. They are acknowledged as having indisputable merit and hence are in little need of this symbolic transaction. In fact, the prize can gain cultural capital for the awarding organization, normally a non-profit, from the well-known recipient, rather than the other way around.

Toni Morrison is an avid collector of prizes. She thought she was going to collect a National Book Award in 1987 for her highly praised novel, Beloved. Instead the judges awarded it to Larry Heinemann, a relatively unknown author, for Paco's Story, his novel about a Vietnam veteran. A scandal ensued. Morrison's friends took out ads in The New York Times demanding that she receive a compensatory Pulitzer Prize, which she did the next year. With some help from her friends.

Many thought that this quest for a prize on Morrison’s behalf was unseemly. The rules of the game are that you should appear to be indifferent to prizes, not really needing their validation.

English is taken by the ability of the metaphor of economic behavior to illuminate the prize economy and its consumers. He makes it work.

First Nights; Five Musical Premieres

First Nights; Five Musical Premieres by Thomas Kelly. Yale University Press, 2001, paper.

First Nights explores how five famous musical compositions were heard and experienced at their premieres. Thomas Kelly describes the musical culture of each time, the techniques and instruments available, and the concert halls. Modern instruments and orchestration now provide performers of these pieces with opportunities that were not then available. Kelly believes, however, that something of the composer's creation has been lost.

The premiere of Claudio Monteverdi's opera L'Orfeo in 1607 seems part of a distant past but also of a familiar present. Monteverdi created L'Orfeo out of the materials at hand in Mantua, then a wealthy town of 40,000 in northern Italy. "A familiar present" because Kelly's account of the performance resembles the mounting of a local concert in many a medium-sized American city.

George Frideric Handel first performed his Messiah in Dublin in 1742. Dublin was then the second city of the British Empire and was building its musical culture. Handel's beloved oratorio had many features of eighteenth-century opera. But Kelly reminds us that its innovative use of solo voices, chorus, and orchestra took an important step away from opera. The libretto was drawn from the Hebrew and Christian Bibles and the Anglican Prayer Book, rather than Greek mythology. It comes closer to theology than storytelling.

Ludwig van Beethoven's career spanned the years of the French Revolution and Empire. Prior to this upheaval, his patrons and audience had been the landed and administrative elites of the Austrian Empire. By 1824, the year he premiered the Ninth Symphony, his Viennese audience was mostly bourgeois. Beethoven was now free of the constraints of aristocratic patronage, but that also meant he had to involve himself more fully in the business of music to make his living.

The symphony as a musical form had been, before Beethoven, shorter and less ambitious. The first movements of Mozart's and Haydn's symphonies had been the longest and most important, followed by three shorter movements. Symphonies were generally performed first, not last, in the evening's musical program, with other works often inserted between the movements. The premiere of the Ninth was the first occasion on which the human voice was incorporated into a symphony's final movement. The performance confirmed Beethoven's transformation of the symphonic form over his career.

The premiere of Hector Berlioz's Symphonie Fantastique in 1830 came only six years after Beethoven's Ninth, but the differences between the composers and their compositions are large. Berlioz was only 27. He had just won a major prize for composition, which, combined with his own hustle, allowed him to draw talent from prestigious musical organizations in Paris for his premiere. Berlioz resembles young celebrity performers of our own day. And this first performance of Symphonie Fantastique would, in some ways, be recognizable to contemporary concertgoers. In other ways, it seems like yesteryear.

Kelly's description of the premiere of Igor Stravinsky's ballet score Le sacre du printemps is unique to a specific cultural time and place: Paris, 1913. Stravinsky's fellow Russians were coloring the Parisian cultural world. Their success centered on the Russian impresario Sergei Diaghilev and the presence in Paris of many Russian composers, choreographers, and dancers. What went wrong on this first night has been much discussed. Vaslav Nijinsky's choreography was controversial, moving Russian dance away from classical ballet gestures and movements. And Stravinsky was carrying orchestral music well along the way toward twentieth-century dissonance. On this first night the audience responded with boos, catcalls, and laughter, at times drowning out the orchestra.

 

Audiences have changed since that year, just over a century ago. The bourgeois audiences that dominate performances of classical music in the 21st century might at first hearing dislike a new composition, but their response would be polite if unenthusiastic applause. In a sense the audience has been erased, far different from the proactive audiences of Stravinsky's Paris.

This Republic of Suffering; Death and the American Civil War

This Republic of Suffering; Death and the American Civil War by Drew Faust. Vintage, 2009, paper.

Drew Faust describes the ways in which death, on the scale that it was experienced in the Civil War, altered attitudes on an array of issues related to the end of life. These war casualties seemed unnatural to nineteenth-century Americans. The war killed young men in huge numbers, then the healthiest demographic and the least expected to die so young.

War deaths violated Victorian and Christian views of the Good Death. One expected to die at home, surrounded by loving family gathered around the deathbed. There were last words; the dying person was fully cognizant of his impending death. All attending hoped that he or she would have an easy death. Generally some religious assurance was at hand.

Instead the bodies of the Civil War dead were scattered where they fell, often in pieces, amongst dead horses and mules. Unless quickly buried, they began to decompose. The dying were far from home and family and unattended in the final moments of life. It was a stranger’s death, in a strange land.

A good portion of the deaths were not battlefield casualties. Rather, soldiers died in field hospitals from their wounds, from gangrene, diarrhea, dysentery, and typhoid fever. Civil War prisoners were led off to an uncertain future in camps like Andersonville, Georgia. Nine percent of the Civil War dead died in prisoner-of-war camps.

Faust explains how doctors and nurses, commanding officers, and fellow soldiers attempted to restore some of the assurances of the Good Death. Letters of condolence were often written by someone who had witnessed the dying moments. If that weren’t possible, the family could be assured that their loved one had died for his country. “I have never witnessed such an exhibition of fortitude and Christian resignation as [your son, brother, husband] showed” would be a likely part of any letter. Some keepsake, perhaps a watch, diary, Bible, lock of hair, might be included in the condolence letter.

What circumstances of warfare in the 1860s shaped this "republic of suffering"? While Civil War weaponry was more deadly than in previous wars, soldiers were often still close enough to see the person they were killing. Moreover, most combatants were not professional soldiers. Slaying someone, even in warfare, was for many a transgression of the Sixth Commandment. As in most armies, a minority did most of the fighting; the rest avoided active combat.

Some individuals, however, acquired a taste for killing. Sharpshooters were particularly likely to do so; they generally killed at close range but hidden from view. When captured they were often shot. Black troops fighting in Northern armies were usually given no quarter.

Civil War soldiers did not have any official identification — no dog tags. They might carry on them a letter of identity with instructions about how to notify their next-of-kin. Neither army, at first, took identification of the dead seriously. Bodies were collected together, having been dragged across rough ground, and buried in mass graves. Thus a large portion of the enormous numbers of dead were never named, but simply declared missing in action. Grieving families would often travel to the battlefields to undertake a generally hopeless search for their loved one.

 

Had he died a 'natural' death, a soldier would have been buried amongst the cluster of his family's graves in a churchyard or in one of the municipal "garden" cemeteries being laid out in the North. To handle the numbers of dead and still provide some semblance of respect, national cemeteries were established alongside the great battlefields, at Gettysburg, for example. Their ordered rows of identical markers attest to the costs of the war, but also to the anonymity of death.

The individual states, later and with great solemnity, erected monuments that now give character to these mass burial sites. Often the states would declare days of remembrance, which eventually coalesced into a national holiday, Memorial Day.

Drew Faust has taken an important piece of American history, previously overlooked or misunderstood, and given it focus and meaning. The book was nominated for a National Book Award, and it is one of my favorite books.

Zounds! A Browser’s Dictionary of Interjections

Zounds! A Browser’s Dictionary of Interjections by Mark Dunn. St. Martins Press, 2005, paper.

 

Mark Dunn has given up on defining an interjection. Some grammarians contend that interjections lie outside the rules of grammar, i.e., that they occupy a linguistic fringe. Other grammarians admit that they are often goofy, rarely literal, but nevertheless part of grammar. Zounds! lists over 750 of them, and Dunn admits that this is only a sampling.

 

Their uses are so varied that creating categories is difficult. A common intent is to shun profanity (swear words) while still making a succinct comment about the matter at hand. Darnation! and gol-durn it!, two of many variations, are stand-ins for the 'D' word. The list of substitutes for swear words is endless. Tarnation! as in, "What in tarnation are you up to?" For Pete's sake! Pete is thought to reference the apostle Peter. Gosh! Or, even more explicit, gosh-all-mighty! instead of the 'G' word. Gee! and gee whiz! and Jiminy Cricket! instead of 'J-C.' There are many interjections that mock piety: Holy guacamole! Holy cow! Holy moly!

 

Many interjections have emerged out of popular entertainment. Jiminy Cricket! was created and sneaked into a Walt Disney movie by its writers. Yadda-yadda! warns the listener that the speaker has chosen to make a quick cut to the finish of a story or argument, or hopes that you will. It is from Jerry Seinfeld's show. As is Giddouttahere!, an interjection to use when you discover someone playing around with your gullibility.

 

Popular interjections come and go. Many on Dunn's list sound like Grandma talking: fiddle-dee-dee! and for crying out loud! Grandma would never have allowed the 'bad words' for which these are stand-ins to be uttered in her presence. If she were really strict about these matters, she might not even have allowed the substitutes within her earshot. The current O.M.G., Oh My God!, would never have been approved by Grandma. Except when recited in the liturgy.

 

Interjections are spoken but rarely seen in print, so Dunn's spellings are useful. The Random House Unabridged Dictionary contains many of Dunn's interjections, and its definitions are better. But first you have to figure out the spelling from a remembered oral tradition. Not so easy.

 

Interjections commonly function as one- or two-word commentaries. They are often impertinent, dismissive, sarcastic, saturated with attitude. Well, boo hoo! is an insincere, mocking lamentation. Duh! is a commentary on the penetrating intelligence of some observation or realization. Whatever! is a contemptuous dismissal indicating a lack of interest. Phooey! is an older dismissive.

 

Excuuuuuse me! is a response to a rude question or thrust. When drawn out, the word “excuse” takes on a different coloration from the polite usage. Similarly puh-leeze! means something very different from “please.” Interjections can of course be affirmative. Awesome! Cool! Groovy! Right On!

 

Some interjections are centuries old, some decades, some date from just the last few years. Some seem so trite as to deserve their forthcoming oblivion. No problem! is one that I would like to see fade, despite its Latin provenance. Enjoy! is overused by waiters; let it die. Have a good one! is powerfully off-putting.

 

But for every interjection that I hope will fade, there are a dozen that I wish would stick around. The secularized Amen! is handy for signifying agreement. Boy, oh boy! is fun but threatened because it is gender-specific. For crying out loud! Heck! I'll say! La-di-da! Lord a mercy! Okey-dokey! Voilà! What the Sam Hill! Our language would be the poorer without its many interjections.

 

I gave a review of Zounds! at one of the retirement communities in town. After the above introduction, I asked the audience to come up with interjections that they remembered, or even used. They produced a list of over 120 that weren't mentioned above, nor by Mark Dunn. Awesome! A partial list: Bah humbug! Bless your little pea-pickin' heart! Dad-burn it! For crying out loud! Groovy! Holy cow! Is the Pope Catholic? My my! Oh my stars and garters! Phooey! Say that again! Take a fly'n leap! You bet your sweet baby!

 

Warning! Don't try these out on a crowd that is younger than, say, 60. You'll get a lot of blank faces, you betcha!

The Discovery of France; A Historical Geography

The Discovery of France; A Historical Geography from the Revolution to the First World War by Graham Robb. W.W. Norton, 2008, paper.

 

Graham Robb argues that in the year of revolution, 1789, France, the most populous country in Europe, was a "terra incognita." Few Parisians knew anything about the France outside the Paris basin. Until the advent of the bicycle, the known universe for most provincials was a radius of fifteen miles from their home.

 

The French language was still not understood in the southern half of France, where the Occitan languages dominated the countryside. There were, and remain to this day, important regional languages — Breton, Catalan, Flemish, Provençal — that have survived French nation-building and high-speed travel.

 

The provinces were thinly inhabited except for parts of the Rhone valley, the Rhineland, Flanders, and the English Channel coast. Robb speaks of these sparse, insular rural populations as tribes and clans, each with its own patois. These "tribes" often claimed to have their own "histories." Many liked to think that their village and its population had Roman ancestry, the result of the Roman conquest of Gaul. Or that they were Normans, remnants of Viking invaders. Villagers were self-sufficient, poor, and, at best, living on the brink of hard times. The "painted peasantry" of the artists shows none of their infirmities.

 

Bourgeois observers of the country and its inhabitants engaged in what Robb calls "moral mapping." They commented on the stupidity and impudence of the peasantry, whose dumb silence was, of course, partly because they could not understand urban French speakers. Frenchmen became smarter the closer their domicile was to the urban, educated elites.

 

Much of this isolation of rural France was the result of the time it took to travel distances. That would change only with the introduction of faster modes of transportation: first the bicycle, some water transport, better roads, and finally the railroads. Napoleon began working on a road network that benefited travelers who could afford to take the public stage coaches. Most Frenchmen could not afford speed.

 

Having described how few Frenchmen traveled, and hence how little they knew about the world beyond their village or town, Robb then enumerates various populations of migrants and commuters. The most celebrated are the transhumant migrations of sheep and goats and their herders to summer pastures in the mountains of eastern and southern France. Apprentice craftsmen acquired techniques of working with local materials by taking the Tour de France (as it was then called). There was a substantial movement of children to urban areas to become chimney sweeps, peddlers, beggars, and petty thieves. They would go back to spend time in their villages, often returning to the cities with handcrafted goods from the rural areas to peddle.

 

There was also a substantial number of pilgrims discovering France. The great pilgrimage to the burial site and shrine of the apostle St. James at Santiago de Compostela in Spain originated in southwestern France, and it generated something like a tourist industry to accommodate those ambulatory pilgrims. Another well-known example was the huge pilgrim traffic to Lourdes and the site of Bernadette Soubirous's visions.

 

There were also numerous local and roadside shrines all over provincial France that were sites of pilgrimage. Many were old Druid sites, sacred stones and other natural objects, attesting to the survival of pagan spirituality. The Catholic Church worked to suppress these shrines. By 1914 many of the stone shrines had been crushed and used to build roads.

 

Despite the slow pace of travel prior to the railroads, Robb claims that rumor could travel nine miles per hour. The most famous example is the 'great fear' that spread through the provinces in July and early August of 1789, the result of rumors that foreign troops and bandits, financed by vengeful aristocrats, had invaded the country. Robb has little to say about the role of the telegraph and the revolution it brought to the speed of communication in the nineteenth century.

 

Robb's The Discovery of France is historical geography at its best. A cyclist himself, he invites travelers and tourists to return to a slower pace and rediscover for themselves France in all of its diversity.

 

India; A Sacred Geography

India; A Sacred Geography by Diana Eck. Harmony, 2013, paper.

 

Mostly a thing of the past in Europe and North America, the pilgrimage tradition continues in India, animating a sacred geography several millennia in the mapping. Vārānasī (Kāshī, Benares) is the best known pilgrimage site in this sacred landscape. But this city on the banks of the Ganga (Ganges) is just one of the sacred places that Diana Eck locates and describes in her fabulous book.

 

There are, in fact, seven sacred cities: Vārānasī, Ayodhyā, Mathurā, Hardvār, Kānchipuram, Ujjain, and Dvāraka. Each has associations with particular gods in the Indian pantheon. All have substantial temple complexes and are sites of pilgrimage, though not necessarily tourist destinations or architecturally interesting. I have toured Vārānasī, Mathurā, and Kānchipuram and looked carefully at their temples.

 

There are also lists of sacred mountains, forests, and particularly rivers linked to elaborate stories of Hindu gods and heroes. Eck describes the Indian notion of tīrthas, fords or crossing places of rivers. These tīrthas have a spiritual significance. Bathing at these locations affords the devotee liberation from the cycle of birth and death and the attainment of nirvāna.

 

Rivers are personified as goddesses. Especially sacred are the headwaters of these rivers and the sites of their convergence. The Narmadā, perhaps India's most revered river, originates in the Indian state of Madhya Pradesh and flows west to Gujarat and the Arabian Sea. The pilgrimage route, along both banks of the river, involves an eighteen-hundred-mile trek. Pilgrims are far more numerous at the annual Māgha Melā at Allahabad, where the Ganga and Yamunā converge. Crowds: we're talking about millions.

 

Indian gods are well-traveled, and their journeys have traced trails through the landscape and created many sacred sites. Eck points out that Hindu religious life has never had a hierarchy: no popes, bishops, etc. Hence there has never been a sorting out of the miscellany of narratives about the gods and their habitats. She uses phrases like "is said to be," "the tale is told," "so they say," and "according to tradition" to indicate their tentativeness.

 

Mention of the great Mughal city of Allahabad brings to mind the fact that the Indian subcontinent is also home to other religions. Jains, Sikhs, Christians, Buddhists, and Moslems have shared this sacred landscape with Hindus. Moslems were initially temple destroyers; their most spectacular act of vandalism was the destruction of the vast temple complex at Somnāth on the Kathiawar Peninsula in the eleventh century. This destruction of temples that were the focus of pilgrimage disrupted temple-based piety, patronage, building, and repair for centuries. But during the Mughal period, in the seventeenth and eighteenth centuries, the Empire and its client Hindu kingdoms proved to be great patrons of temple building and reconstruction.

 

The reverse happened recently when Hindu chauvinists destroyed a mosque at Ayodhyā said to have been built on the birthplace of Lord Rāma. Most Hindus, though, are ecumenical. For example, they revere the tombs of Sufi saints scattered throughout northern India, and many Buddhist shrines as well.

 

Like Islam and Christianity, Hinduism has its sects. They get along better than the various Christian churches have, however, and are able to share pilgrimage sites. Shaivite pilgrims most commonly return to the circuit of temples associated with devotion to Shiva, and Vaishnavite pilgrims generally head for the temples associated with Vishnu. But Kānchipuram, a temple complex in Tamil Nadu associated with the Shaivites, and the Jagannātha temple at Puri in Orissa, associated with the Vaishnavites, are sites of pilgrimage for both sects. They are also examples of striking temple architecture.

 

In my travels around India over three decades, I tended to avoid crowds and therefore a landscape filled with pilgrims. The one exception was attending the Krishna Janmāshtamī, a midnight darshan, or auspicious presence of the god, at his birthplace near Mathurā, south of Delhi. The temple that contains the image of Krishna was rebuilt in the nineteenth century and is not architecturally inspiring. One gets the impression that a good part of the religious atmosphere of the darshan comes from the feeling of being part of a vast gathering.

 

Indian nationalists in the last century claimed to have discovered 'Mother India,' a unity stretching from the Himalayas to Tamil Nadu. Eck reminds us that these circuits of temples and pilgrimage sites had, in fact, provided Indians with a singular religious landscape, both inspiring and unifying, centuries before their struggle for independence.