Creationism and Its Critics in Antiquity

Creationism and Its Critics in Antiquity by David Sedley. University of California Press, 2009, paper.

 

The world seems to be configured in ways hospitable to life forms. Is this the outcome of divine planning or an accident? Creationism, the argument that the world’s structure and contents can be adequately explained only by postulating an intelligent designer or creator god, was favored by the classical world. Greek and Roman creationists differed amongst themselves over the mechanisms that the creator used, or at least had at his disposal. Critics of these ancient creationists argued that the origins of life could be explained as the outcome of chance occurrences or accidents, made possible by the infinity of space and matter.

 

David Sedley cautions the reader that this debate is unlike the present-day dispute between creationists and scientists. Fundamentalists base their beliefs about the origins of the world on the several creation stories in ancient Hebrew texts; scientists, on deductions from their accumulated observations of nature. Neither position was held by the ancient creationists or by their critics.

 

Sedley doesn’t discuss the creation stories told by the ancient Israelites. They differed from the Greek accounts in numerous ways. For example, the Israelite tradition has Yahweh creating the world out of nothing, whereas the Greek metaphor for the creator god was that of the craftsman, or sculptor, fashioning the world out of existing space and matter.

 

Classical creationists most often held an anthropocentric view of creation. A benevolent god had created plants and “lower” animals to serve the needs of humans, an idea not unlike the Hebrew tradition of Yahweh giving humans “dominion” over His creation. Plato, for example, talks about the creator’s goodness. Their critics asked: how, then, does one explain aspects of this created world that were not beneficial to mankind? Man-eating beasts, volcanoes, earthquakes?

 

The origins of natural phenomena were not the focus of creationist speculation. Rather, creationists thought in terms of ends or goals, intentions or purposes.

 

There were, Plato believed, several episodes of creation. First a divine creator, or demiurge, fashioned the material world, which was subsequently infused with a world soul. Did Plato actually believe in the mythological artifice he invoked? Sedley compares Plato’s use of myth to John Locke’s social contract: for Locke, social and moral relations are fundamentally contractual, yet he did not believe that an actual written or verbal contract existed in the historical past. Might the same be true of Plato’s creationism?

 

Classical notions of creation were embedded in a rich philosophical tradition. For example, the Greeks were fond of the idea of symmetry, which raised the question: could the cosmos have had a beginning but no end? Restoring symmetry meant that there could be no beginning, no time when there wasn’t space or matter. Or one could hypothesize no end, which for the ancients was also a problem.

 

The Greeks and Romans never came up with the idea of a creator god who then left his creation to run by itself. This is a much younger metaphor. A divine craftsman fashioned a clock-like world, wound its spring, and subsequently had no power, or chose not, to interfere in the working out of his creation.

 

The classical world also never came up with the idea that one species evolved out of another, a key notion in evolutionary biology in the nineteenth century and since. However, some Epicureans came close to the idea of adaptation through natural selection, a process involving extinctions and survivals.

 

Creationists had difficulty explaining why the rich fossil record of the Mediterranean world revealed creatures no longer present in their own age. Why would a divine creator have created anything so imperfect that it perished? Or were these creatures still to be found in some corner of the earth?

 

David Sedley has woven together ideas and movements from Greek cosmology and natural history that demonstrate a diversity of views over several centuries. It was a genteel discourse; not as much was at stake as would be the case in the Christian West in a later age.

Flags of Our Fathers

Flags of Our Fathers by James Bradley with Ron Powers. Bantam Books, 2006, paper.

 

Flags of Our Fathers is a son’s commemoration of a veteran of the Pacific War. Published in 2000, it was the basis for a film released in 2006. The elder Bradley had died a few years before the book’s publication. In clearing out a closet, his family discovered boxes of letters and photographs that told the story of his war experiences.

 

During World War II, John Bradley served in the Navy Medical Corps attached to a Marine company. He was part of the first wave of an amphibious force of 80,000 men that landed on the Japanese island of Iwo Jima in February 1945; 2015 marked the battle’s 70th anniversary.

 

Capturing the island was important to our strategy of defeating Japan. Japanese planes, based on Iwo, were menacing our bombers on their way to Japanese cities. The island had 22,000 fanatical Japanese defenders who would fight to the death. Mostly young recruits, they had constructed a network of pillboxes and underground bunkers that made them almost impossible to see, and kill.

 

On the fourth day of the battle, a company commander sent a platoon to secure the top of Mount Suribachi, the highest point on the island. A flag was taken to the top and unfurled so that the Americans fighting below could have some assurance that victory was theirs, even though they had a month of deadly combat still ahead.

 

The company commander, thinking that the first flag should become a battle souvenir, sent up a replacement flag with four marines who were laying a telephone line to an observation post at the top of Suribachi. They were joined by two others in hoisting the replacement flag, one of them the author’s father.

 

This second flag raising was caught by an Associated Press photographer, Joe Rosenthal. His image was transmitted by wire to the U.S. Two days later, the photograph of the raising of the American flag over Iwo Jima appeared on the front pages of most Sunday morning newspapers. It became an instant sensation and then an American icon.

 

Three of the men who had raised the flag died before the battle was over. John Bradley was wounded and in a military hospital when those in the photograph were identified. He and the two surviving marines were flown back to the States to be the focus of a war bond drive in the spring of 1945.

Bradley found this celebrity status a mockery of both the brutality and heroism of battle. His two comrades also found the abrupt transition from battlefield to instant celebrity shattering; they had left their comrades behind to finish the battle. One of them, Ira Hayes, a Native American, never recovered. He took to the bottle and died a tragic death in 1955.

 

You are told enough about these six men to understand much about their lives and families: their common experience as kids growing up in Depression-era America; their Marine training, which had transformed them collectively into efficient killing machines; the horror of what they saw and experienced; the death of many of their companions; the difficult return to civilian life after the shock of battle; the nightmares and weeping. To cope with those phantoms, John Bradley found it necessary to put out of his mind the most vivid experience of his life.

 

James Bradley learned from a third-grade teacher that his father was in the Iwo Jima photograph. When he asked his dad about being a hero, the elder Bradley insisted that the picture had nothing to do with heroism and that the real heroes were the men who never returned. The subject was dropped and never again raised in his presence.

 

Flags of Our Fathers received much praise from readers and reviewers; few books have captured the complexity and fury of war and its aftermath as well. It reminds us that the memory of the Pacific war will soon no longer be in the possession of the living. The ‘greatest generation’ will no longer be around to tell their story.

 

Over Here; How the G.I. Bill Transformed the American Dream

Over Here; How the G.I. Bill Transformed the American Dream by Edward Humes.  Harcourt, 2006.

 

Edward Humes puts the Servicemen’s Readjustment Act (1944) in the same class of “transformative legislative achievements” as the Morrill Land-Grant College Act (1862) and the Civil Rights Act (1964). Commonly known as the G.I. Bill, it granted benefits to sixteen million returning servicemen and women: hospitalization, unemployment benefits, free higher education and technical training, and low-interest home loans with no down payment. Humes contends, though, that few in 1944 understood its transformative character.

 

The G.I. Bill began as part of a third wave of Roosevelt’s New Deal legislation. Veterans’ organizations across the country, including the American Legion, supported benefits. Most Americans wanted to show their gratitude to those who had fought the good war. And Congress was looking for a means of ensuring that returning servicemen readjusted to civilian life; they would, after all, become voters.

 

Servicemen returning from World War I had not been treated so well. After much agitation by veterans, the Coolidge Administration finally agreed to a bonus, but in the form of government bonds worth $7,000 (in 2006 dollars) and not redeemable until 1944. A protest by veterans in Washington in 1932 (the Bonus Army) had been broken up by the army, under orders from President Herbert Hoover and led by General Douglas MacArthur. It was a public relations disaster for the Hoover Administration, and everyone wanted to avoid a repeat of that spectacle.

Despite Roosevelt’s intentions, the G.I. Bill was not colorblind.

Representative John Rankin of Mississippi, head of the House committee on veterans’ legislation, understood that benefits given to returning black soldiers would transform the South’s racial politics. Initially hoping to block the bill, Rankin managed to insert a provision that turned the awarding of benefits over to local and state officials.

 

In this and other ways the nation’s generosity did not extend equally to black Americans. Nor to the millions of defense workers, many of them women, who had also interrupted their lives to fight the good war.

 

Most universities and colleges welcomed returning veterans to their classrooms, emptied in the ’40s by the military draft. Humes points out, though, that some patronizing university educators were “concerned” about how returning veterans would fare in their elitist institutions. Many would be the first in their family to enter college; some were high school dropouts.

 

It turns out that G.I.s were ‘overachievers.’ They were serious about wanting a college education and then getting on with their lives. In 1956, six years after most G.I.s had graduated and gone off to jobs and families, Iowa State University faculty were still commenting on what good students they had been.

 

The G.I. Bill also altered the landscape of urban America. The no-down-payment, low-interest mortgages created a huge demand for modest “starter homes.” Since little housing had been built during the Depression and the war, the Bill was welcomed by the housing industry. There was, however, little in the way of urban planning, and most of the new housing was mass-produced, built in the suburbs, and largely for white veterans.

 

Housing was a lead sector with huge linkage effects to the furniture, appliance, and houseware industries. Detroit loved the new drive to the suburbs. In other words, housing was vital to the American economy’s survival once war production ended.

 

The G.I. Bill created a class of capitalists who owned a house and hence an asset; they became part of the ‘ownership society.’ Perhaps ownership was one reason veterans tended to become more conservative as the years passed. Humes notes that World War II veterans often opposed similar benefits for the veterans of the Vietnam War. The “greatest generation,” having benefited from big-government programs, ultimately opposed big government.

 

 

American Chestnut

American Chestnut; The Life, Death, and Rebirth of a Perfect Tree by Susan Freinkel. University of California Press, 2009, paper.

Back in 1904 an urban forester noticed that American chestnuts (Castanea dentata) in the New York Zoological Park, now the Bronx Zoo, were dying of a blight caused by a fungus. The fungus would soon spread to other chestnut trees in the city, then to the forests of the eastern United States and Appalachia. By the end of the sad story that Susan Freinkel tells, billions of trees were dead and the American chestnut was all but wiped out everywhere in its range.

 

The American chestnut had been an important species in its ecosystem, dominating the upper canopy. Never the preferred wood for the furniture and building industries, it was ‘second best’ but a good source of tannin. Its ample nut harvest was one of fall’s pleasures; we sang about “chestnuts roasting on an open fire.”

 

Foresters were divided about how to respond to the blight. Some believed that it was hopeless to fight the fungus. Might as well harvest the trees, dead and alive. There was one brief effort to stop the blight, financed by the state of Pennsylvania, to no effect.

 

The blight fungus kills the trunks and branches of the tree; it doesn’t destroy the tree’s resilient root system. New growth sprouting from its roots will continue for years, only to be killed off by the fungus before the young trees have matured enough to reproduce. Eventually the root system wears out and the tree dies.

 

The U.S. Forest Service’s advice to take down even healthy specimens complicated, Freinkel contends, efforts to find disease-resistant trees. Individual trees with a slightly different genetic makeup might have survived and propagated. Occasionally even groves of chestnuts did survive, mostly outside the normal range of the tree.

 

There were two approaches to fighting the demise of the species: fix the tree so that it couldn’t be hurt by the fungus, or fix the fungus so that it couldn’t damage the tree. The European chestnuts were saved by the second strategy: a virus was found that sapped the fungus’s virulence. That approach was also tried here, with little success. Because there are different strains of the fungus, and because the fungus can develop immunity to any virus, breeders had to contend with an ever-changing target.

 

The more successful strategy in this country has been to cross the American chestnut with the blight-resistant Chinese chestnut. But Chinese chestnuts are short understory trees, not capable of dominating the upper story of American forests. Worried about form, American breeders backcrossed the Chinese/American hybrid with a surviving C. dentata, and repeated the process until they had a tree that retained the resistance of the Chinese chestnut but had the form of the American tree.

 

Freinkel reminds us, however, that there will be problems when reintroducing the ‘tame’ chestnut into the ‘wild.’ Our forests have had one hundred years to adjust to the demise of the chestnut. Oaks and poplars now dominate their upper canopy.

 

Freinkel also discusses an option that biotechnologists have proposed. They would alter the genetic structure of the tree by inserting a gene into chestnut embryos that will increase its resistance enough for it to survive. That scares a lot of folks, however, just like genetically-modified food crops do. A genetically modified tree would essentially be a new species, posing the problems of invasive species and genetic drift. Biogeneticists assure us that Mother Nature usually takes care of the first problem, and they deny the second.

 

Then there is the issue of the ‘wild’ to which eastern forests should be returned. The American chestnut is a fire-tolerant species, and its dominance was likely the result of Native Americans’ use of fire to maintain their woods.

 

The book mourns the loss of this wonderful tree. It is also an absorbing account of our continuing interaction with a much damaged forest environment.

 

The Economy of Prestige

The Economy of Prestige; Prizes, Awards, and the Circulation of Cultural Value by James English. Harvard University Press, 2008, paper.

This is a fascinating look at prizes, awards, and festivals in the arts and letters. James English discusses mostly prizes in literature, music, and film.  He argues that the awards ‘economy’ is a system of non-monetary symbolic transactions, enabling the artist or author and the institutions that award and receive these prizes to engage in a collective project of value production.

English mostly defends the system. Prizes such as the Oscar, the Nobel, the Pulitzer, the National Book Award, and the Man Booker uphold a long tradition of valuing and honoring creative individuals. But it is true that the award ceremonies are often farce, circus, embarrassment: all those fancy gowns worn by starlets hoping to be seen, the crass commercialism of the ceremony.

There are precedents from ancient Greece and eighteenth-century Europe, but the establishment of the Nobel Prize in Literature in 1901 marks the beginning of the contemporary economy of prestige. It remains the benchmark of all cultural prizes. Alfred Nobel’s prize was emulated by the Prix Goncourt in France in 1903 and the Pulitzer Prize a few years later in the U.S. And so on. And on. Prizes, awards, festivals, biennales, halls of fame, etc. are now literally too numerous to count.

Yet this field of cultural prizes is not overgrown, English argues. In fact, opportunities or gaps for new prizes are continually being opened. For example, various minorities or new cultural movements have established prizes because they did not feel represented by the existing field of cultural awards.

English contends that this prize economy has been a mixed blessing to the arts and letters in the emerging, post-colonial countries. Generally their films and novels have received international prizes as a result of their popularity in the old imperial metropolises. But are these prize-winners the best representatives of the arts from the former colonies? The Nobel Prize given to Wole Soyinka in 1986 was controversial in his native Nigeria. His Euro-modernism was only one of several directions in which Nigerian literature was moving. True, Africa had long been neglected by the Swedish Academy. But why Soyinka, other than his popularity outside Nigeria?

The administrative costs of most cultural prizes far exceed the cash prize itself. Administrators have an immense job: they must select a final few from the many, many nominations. English estimates that 98% of the selection process is done before the publicly acknowledged judges and juries commence their work. Professional readers, sometimes even the office staff, “prejudge,” deciding who makes the final cut. Among other criteria, they must consider whether potential recipients are well enough known to give the prize credibility.

The economic capital generated by an award depends upon the prestige of the judges as well. They have to be well known and respected in their field.

Occasionally a recipient decides that it is beneath his or her cultural status, or against his or her principles, to accept a prize. English talks about a strategy of condescension that allows those who reject prizes nevertheless “to remain in play.” Julie Andrews, who refused a Tony nomination in 1996, is said to have staged that refusal so as to publicize her then-current Broadway show. Perhaps the judges were not worthy of judging her talent.

Andrews can afford to refuse a prize or two. She has many under her belt. Michael Jackson had won no fewer than 240 awards before he died.

Ironies abound. Most commonly the prize winners are already well known. They are acknowledged as having indisputable merit, hence in little need of this symbolic transaction. In fact the prize can gain cultural capital for the awarding organization, normally a non-profit, from the well-known recipient, rather than the other way around.

Toni Morrison is an avid collector of prizes. She thought she was going to collect a National Book Award in 1987 for her highly praised novel, Beloved. Instead the judges gave the award to Larry Heinemann, a relatively unknown author, for Paco’s Story, his novel about a Vietnam veteran. A scandal ensued. Morrison’s friends took out ads in The New York Times demanding that she receive a compensatory Pulitzer Prize, which, with some help from her friends, she did the next year.

Many thought that this quest for a prize on Morrison’s behalf was unseemly. The rules of the game are that you should appear to be indifferent to prizes, not really needing their validation.

English is taken by the ability of the metaphor of economic behavior to illuminate the prize economy and its consumers. He makes it work.

First Nights; Five Musical Premieres

First Nights; Five Musical Premieres by Thomas Kelly. Yale University Press, 2001, paper.

First Nights explores how five famous musical compositions were heard and experienced at their premieres. Thomas Kelly describes the musical culture of the time, the techniques and instruments available, and the concert halls. Modern instruments and orchestration now provide performers of these pieces with opportunities that were not available then. Kelly believes, however, that something of the composer’s creation has been lost.

The premiere of Claudio Monteverdi’s opera L’Orfeo in 1607 seems part of a distant past but also of a familiar present. Monteverdi created L’Orfeo out of the materials at hand in Mantua, then a wealthy town of 40,000 in northern Italy. “A familiar present” because Kelly’s account of the performance resembles the mounting of a local concert in many a medium-sized American city.

George Frideric Handel first performed his Messiah in Dublin in 1742. Dublin was then the second city of the British Empire and was building its musical culture. Handel’s beloved oratorio had many features of eighteenth-century opera, but Kelly reminds us that its innovative use of solo voices, chorus, and orchestra took an important step away from opera. The libretto was drawn from the Hebrew and Christian Bibles and the Anglican Prayer Book, rather than Greek mythology. It comes closer to theology than storytelling.

Ludwig van Beethoven’s career spanned the years of the French Revolution and Empire. Prior to this upheaval, his patrons and audience had been the landed and administrative elites of the Austrian Empire. By 1824, the year he premiered the Ninth Symphony, his Viennese audience was mostly bourgeois. Beethoven was now free of the constraints of aristocratic patronage, but that also meant he had to involve himself more fully in the business of music to make his living.

Before Beethoven, the symphony as a musical form was shorter and less ambitious. The first movements of Mozart’s and Haydn’s symphonies had been the longest and most important, followed by three shorter movements. Symphonies were generally performed first, not last, in an evening’s musical program, with other works often inserted between the movements. The premiere of the Ninth was the first occasion on which the human voice was incorporated into a symphony’s final movement. The performance confirmed Beethoven’s transformation of the symphonic form over the course of his career.

The premiere of Hector Berlioz’s Symphonie Fantastique in 1830 came only six years after Beethoven’s Ninth, but the differences between the composers and their compositions are large. Berlioz was only 27. He had just won a major prize for composition, and that, combined with his own hustle, enabled him to draw talent from prestigious musical organizations in Paris for the premiere. Berlioz resembles young celebrity performers of our own day. This premiere performance of Symphonie Fantastique would, in some ways, be recognizable to contemporary concertgoers; in other ways, it seems like yesteryear.

Kelly’s description of the premiere of Igor Stravinsky’s ballet score Le sacre du printemps belongs to a specific cultural moment: Paris, 1913. Stravinsky’s fellow Russians were coloring the Parisian cultural world, their success centered on the impresario Sergei Diaghilev and the presence in Paris of many Russian composers, choreographers, and dancers. What went wrong on this first night has been much discussed. Vaslav Nijinsky’s choreography was controversial, moving Russian dance away from classical ballet gestures and movements. And Stravinsky was carrying orchestral music well along the way toward twentieth-century dissonance. On this first night the audience responded with boos, catcalls, and laughter, at times drowning out the orchestra.

 

Audiences have changed since that year, just over a century ago. The bourgeois audiences that dominate performances of classical music in the 21st century might at first hearing disregard a new composition, but their response would be polite if unenthusiastic applause. In a sense the audience has been erased, far different from the proactive audiences of Stravinsky’s Paris.

This Republic of Suffering; Death and the American Civil War

This Republic of Suffering; Death and the American Civil War by Drew Faust. Vintage, 2009, paper.

Drew Faust describes the ways in which death, on the scale experienced in the Civil War, altered attitudes on an array of issues related to the end of life. These war casualties seemed unnatural to nineteenth-century Americans. The war killed huge numbers of young men, then the healthiest demographic and the group least expected to die.

War deaths violated Victorian and Christian views of the Good Death. One expected to die at home, surrounded by loving family gathered around the deathbed. There were last words; the dying person was fully cognizant of his impending death. All attending hoped that he or she would have an easy death. Generally some religious assurance was at hand.

Instead the bodies of the Civil War dead were scattered where they fell, often in pieces, amongst dead horses and mules. Unless quickly buried, they began to decompose. The dying were far from home and family and unattended in the final moments of life. It was a stranger’s death, in a strange land.

A good portion of the deaths were not battlefield casualties. Rather, soldiers died in field hospitals from their wounds, from gangrene, diarrhea, dysentery, and typhoid fever. Civil War prisoners were led off to an uncertain future in camps like Andersonville, Georgia. Nine percent of the Civil War dead died in prisoner-of-war camps.

Faust explains how doctors and nurses, commanding officers, and fellow soldiers attempted to restore some of the assurances of the Good Death. Letters of condolence were often written by someone who had witnessed the dying moments. If that weren’t possible, the family could be assured that their loved one had died for his country. “I have never witnessed such an exhibition of fortitude and Christian resignation as [your son, brother, husband] showed” would be a likely part of any letter. Some keepsake, perhaps a watch, diary, Bible, lock of hair, might be included in the condolence letter.

What circumstances of warfare in the 1860s shaped this “republic of suffering”? While Civil War weaponry was more deadly than in previous wars, soldiers were often still close enough to see the person they were killing. Moreover, most combatants were not professional soldiers, and slaying someone, even in warfare, was, for many, a transgression of the Sixth Commandment. As in most armies, a minority did most of the fighting; the rest avoided active combat.

Some individuals, however, acquired a taste for killing. Sharpshooters were particularly likely to do so. They generally killed at close range but hidden from view. When captured they were often shot. Black troops fighting in Northern armies were usually given no quarter.

Civil War soldiers did not have any official identification — no dog tags. They might carry on them a letter of identity with instructions about how to notify their next-of-kin. Neither army, at first, took identification of the dead seriously. Bodies were collected together, having been dragged across rough ground, and buried in mass graves. Thus a large portion of the enormous numbers of dead were never named, but simply declared missing in action. Grieving families would often travel to the battlefields to undertake a generally hopeless search for their loved one.

 

Had he died a ‘natural’ death, a soldier would have been buried amongst his family’s graves in a churchyard or in one of the municipal “garden” cemeteries being laid out in the North. To handle the numbers of dead and still provide some semblance of respect, national cemeteries were established alongside the great battlefields, at Gettysburg, for example. Their ordered rows of identical markers attest to the costs of the war, but also to the anonymity of death.

The individual states, later and with great solemnity, erected monuments that now give character to these mass burial sites. Often the states would declare a day of remembrance; these observances eventually coalesced into a national holiday, Memorial Day.

Drew Faust has taken an important piece of American history, previously overlooked or misunderstood, and given it focus and meaning. The book was nominated for a National Book Award. It is one of my favorite books.