Liberty Enlightening the World....
“Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door.”
by EMMA LAZARUS
By giving voice to the stoic and silent Statue of Liberty, Lazarus epitomizes how Jewish Americans and other immigrants hoped to shape their new country.
Emma Lazarus, “The New Colossus” (1883)
Emma Lazarus’ famous words, “Give me your tired, your poor,/Your huddled masses yearning to breathe free,” may now be indelibly engraved into the collective American memory, but they did not achieve immortality overnight. In fact, Lazarus’ sonnet to the Statue of Liberty was hardly noticed until after her death, when a patron of the New York arts found it tucked into a small portfolio of poems written in 1883 to raise money for the construction of the Statue of Liberty’s pedestal. The patron, Georgina Schuyler, was struck by the poem and arranged to have its last five lines become a permanent part of the statue itself. More than twenty years later, children’s textbooks began to include the sonnet, and Irving Berlin wrote it into a Broadway musical. By 1945, the engraved poem, now including all fourteen lines, had been relocated over the Statue of Liberty’s main entrance.
Today the words themselves may be remembered far better than the poet herself, but in Lazarus’ time just the opposite was true. As a member of New York’s social elite, Emma Lazarus enjoyed a privileged childhood, nurtured by her family to become a respected poet recognized throughout the country for verses about her Jewish heritage. A reader and a dreamer, Lazarus had the good fortune to claim Ralph Waldo Emerson as a pen-pal and mentor. Before her death at age 37, Lazarus grew from a sheltered girl writing flowery prose about Classical Antiquity into a sophisticated New York aristocrat troubled by the violent injustices suffered by Jews in Eastern Europe.
In “The New Colossus,” Lazarus contrasts the soon-to-be-installed symbol of the United States with what many consider the perfect symbol of the Greek and Roman era, the Colossus of Rhodes. Her comparison proved appropriate, for Bartholdi himself created the Statue of Liberty with the well-known Colossus in mind. What Bartholdi did not intend, however, was for the Statue of Liberty to become a symbol of welcome for thousands of European immigrants. Conceived as political propaganda for France, the Statue of Liberty was first intended to light a path of enlightenment for the countries of Europe still battling tyranny and oppression. Lazarus’ words, however, turned that idea on its head: from then on, the Statue of Liberty would be considered a beacon of welcome for immigrants leaving their mother countries.
Just as Lazarus’ poem gave new meaning to the statue, the statue projected a new ideal for the United States. Liberty did not only mean freedom from the British aristocracy that led the American colonists to the Revolutionary War. Liberty also meant freedom to come to the United States and create a new life without religious and ethnic persecution. Through Lazarus’ poem, the Statue of Liberty gained a new name: she would now become the “Mother of Exiles,” torch in hand to lead her new children to American success and happiness.
Poem: The New Colossus
Not like the brazen giant of Greek fame
With conquering limbs astride from land to land;
Here at our sea-washed, sunset gates shall stand
A mighty woman with a torch, whose flame
Is the imprisoned lightning, and her name
Mother of Exiles. From her beacon-hand
Glows world-wide welcome; her mild eyes command
The air-bridged harbor that twin cities frame.
“Keep, ancient lands, your storied pomp!” cries she
With silent lips. “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tossed to me,
I lift my lamp beside the golden door!”
by Emma Lazarus, New York City, 1883
Statue of Liberty National Monument
Emma Lazarus’ Famous Poem
A poem by Emma Lazarus is graven on a tablet within the pedestal on which the statue stands.
The New Colossus
Not like the brazen giant of Greek fame,
With conquering limbs astride from land to land;
Here at our sea-washed, sunset gates shall stand
A mighty woman with a torch, whose flame
Is the imprisoned lightning, and her name
Mother of Exiles. From her beacon-hand
Glows world-wide welcome; her mild eyes command
The air-bridged harbor that twin cities frame.
“Keep, ancient lands, your storied pomp!” cries she
With silent lips. “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!”
The New Colossus
Emma Lazarus wrote “The New Colossus” in 1883 for an art auction “In Aid of the Bartholdi Pedestal Fund.” While France had provided the statue itself, American fundraising efforts like these paid for the Statue of Liberty’s pedestal. In 1903, sixteen years after her death, Lazarus’ sonnet was engraved on a plaque and placed in the pedestal as a memorial.
The famous sonnet echoes many of the conflicting identities and ideals Lazarus dealt with in her own life. As an American author, she felt that ancient lands could keep their old traditions and “storied pomp.” At the same time, Lazarus invoked her ancient Greek ideals by transforming the “brazen giant” into a “Mother of Exiles” who retains Greek majesty, beauty and defiance as a new Colossus. The compassion of the lines “huddled masses yearning to breathe free” welcomes the tired immigrants, but the following image of the “wretched refuse of your teeming shore” hints at the condescension these refugees were to suffer. And while this Mother of Exiles’ eyes command, and she stands alone as a beacon to all the world, she is still an ambiguous figure of power, speaking only with “silent lips.”
Struggling beneath the poem’s surface, these tensions (between ancient and modern, Jew and American, voice and silence, freedom and oppression) give Emma Lazarus’ work meaning and power. As James Russell Lowell wrote her, “your sonnet gives its subject a raison d’être.”
JUST 102 words and phrases are needed to evoke the history of the 20th century. Wars and men on the moon, rock’n’roll and Aids are all there.
This exercise in condensation has been achieved by Collins Dictionaries, which claims to have drawn up a list of the words that define the century. They were chosen from those that became common usage and popped up in dictionaries and other reference books in each year.
Diana Treffry, publishing manager of Collins English Dictionaries, said: “We tried to choose words that came into being in a particular year which suggested a view of that time and ultimately of the century itself.”
One of the darkest shadows cast across our times is hinted at in the very first word chosen, radioactivity, which gained currency in 1896. Aspirin is the next entry.
The pain of the subsequent decades is well documented. Air raid and tank, Cheka, bolshie and fascism cover just the war and revolution years until 1919. The misery that followed can be traced through neutron, Gestapo, radar, Dam Busters and doodlebug. But the year when the Second World War ended is marked by Tupperware.
The grim catalogue continues with the deceptively cheery‑looking bikini and the Cold War expressions Big Brother, newspeak and Nato. Cultural Revolution is a reminder that the 1960s were not as frivolous as some might recall (or fail to recall). Within recent memory the optimism that accompanied the entry of Velvet Revolution into our lexicon was soon dispelled by ethnic cleansing.
At home, one might be forgiven for thinking that politics were dominated by socialism. The century opened with the emergence of Labour Party and the word for 1997 (tossed in as a bonus 102nd entry) is Blairite. Thatcherism, however, is not included, nor is feminism or women’s lib.
The extraordinary, head‑long advance of science is well charted, from fingerprint to robot, moonbuggy and the Information Superhighway. Medical advances, including penicillin and test‑tube baby, have been great. But Aids is a salutary reminder that we do not control nature as well as we conjure up acronyms to describe its more horrible manifestations. The 1950s was perhaps the jolliest decade of all. A Teddy Boy could go to a discothèque, listen to rock’n’roll, get stoned and take the Hovercraft home with only John Osborne’s angry young man to spoil the psychedelic party.
BY PHILIP HOWARD
HOW can we best take the measure of the 20th century? Collins Dictionaries claim to have spelt it out in 101 words and phrases, plus a bonus, coined in those years.
Many other ways have been used to mark time. The Romans measured out their centuries by the names of their consuls. The Greek system of chronology, using quadrennial Olympiads, lasted for more than 1,000 years. For six centuries after the birth of Christ, few people were conscious that they were living in the Christian Era. And when our present Christian (Common) Era was established, we managed to get the date of the birth of Christ between four and seven years wrong. So the method chosen by HarperCollins, the publishers, to describe a century by the words that came into English in each of its years is no odder than some. It is quite as accurate as the systems of folk memory and weather lore that most of mankind has used for its chronology for most of its history. For modern linguistics is nearly a science, or at least a pseudoscience. And the date of the emergence of a new word can be calculated quite as accurately as the harvests, inundations and frozen winters that were used to count the years before records and mass communication. Nevertheless, this is a highly subjective new system of chronology. Thousands of words are coined every year in English in its many national and regional dialects, and personal idiolects. Most of them, in the verbose specialities of science and computers, are meaningless outside those fields.
The general words selected draw a time chart of a terrible and wonderful century. Some are already historical and obsolete, and choosing them is necessarily problematic. But the conclusion is sound: we preserve the present for the future every time that we open our mouths.
Words of the century
Philip Howard gives his own definitions of HarperCollins’s words of the century.
1896 Radioactivity This word wears its derivation on its face. The spontaneous alteration of the nuclei of radioactive atoms. Alpha, beta and gamma radiation.
1897 Aspirin First used for headaches and fever, now said to prevent heart attacks. The trade name, now generic in Britain, was derived from the Greek.
1898 Krypton Greek for “hidden thing”. A colourless, odourless gas used in fluorescent lights and lasers. Superman’s planet and The Krypton Factor.
1899 Gamine An elfish young woman. French for a young street Arabess. Audrey Hepburn in Roman Holiday was the archetypal, unforgettable, adorable gamine.
1900 Labour Party Latin toil. Old English for “pain and grief”. New Labour is more chardonnay and avocado dip, less beer and sandwiches.
1901 Fingerprint No two fingerprints are the same. First used in India as a means of identifying suspects. Then in England and in thrillers. DNA is better.
1902 Teddy Bear Brooklyn candy store owner Morris Michtom made the first teddy bear of brown plush and named it for that great huntsman, Theodore Roosevelt.
1903 Tarmac The eponym of the paving material is John Loudon McAdam (1756‑1836), the Scottish engineer.
1904 Fifa The acronym for Fédération Internationale de Football Association. Professional football is overpaid lions organised by donkeys and hyenas.
1905 Sinn Fein Irish republican movement founded about 1905. In Irish “we ourselves”. In practice it means murder, anarchy and madness.
1906 Suffragette A woman fighting for the right to vote. Latin suffragium means a voting tablet. Women’s efforts in the First World War persuaded British males that women could also be trusted with the vote.
1907 Allergy Special sensitivity of the body that makes it react, with an exaggerated response, eg, hay fever.
1908 Borstal A reformatory for “juvenile adults”. Toponym of the village near Rochester in Kent where this corrective penal method was introduced.
1909 Jazz Enough to form a jazz group are credited with lending their names to this word. A dancing slave called Jasper on a New Orleans plantation in 1825? Mr Razz, a conductor? Charles or Chaz, a Vicksburg drummer?
1910 Girl Guide Agnes Baden Powell’s organisation for girls. Originally with blue serge skirts, straw hats, haversacks and poles.
1911 Air Raid The first air raid dropped two hand‑held bombs from an Italian biplane on a Libyan airfield.
1912 Schizophrenia A psychotic disorder characterised by progressive deterioration of the personality. Its use to mean “indecisive” is a solecism.
1913 Isotope One of two or more atoms that have the same atomic number (same number of protons), but a different number of neutrons. Coined by Fred Soddy.
1914 Vorticism Short‑lived movement in British painting, begun by Wyndham Lewis in Blast. The avant‑garde against established culture.
1915 Tank Winston Churchill, who advocated its use, gave it the name tank as a cover to conceal the new weapon from spies.
1916 Dada International revolutionary movement in art and literature. The name, a hobbyhorse, was plucked at random.
1917 Cheka Bolshevik secret police. The acronym of the Russian words for Extraordinary Commission for Combating Counter‑Revolution, Sabotage and Speculation. Superseded by Ogpu, NKVD, the KGB and other ugly abbrevs.
1918 Bolshie/Bolshy Anybody tricky to handle or radical. A shortening of Bolshevik, first used as a disparaging adjective by D. H. Lawrence.
1919 Fascism Mussolini founded the Fasci di Combattimento in Milan in March. The fasces of rods with an axe for beheading were carried by a Roman lictor.
1920 Robot In Karel Capek’s play R. U. R. mechanical men made by the Rossum Universal Robots Corporation revolt. From the Czech robata “work”.
1921 Chaplinesque Characteristic of the first international film star, ie, sentimental, pathetic and on the side of the underdog.
1922 Gigolo A young man “kept” financially by an older woman in return for his attentions.
1923 Spoonerism Metathesis (accidental transposition of letters or syllables) named after the Rev William Archibald Spooner, dean and warden of New College, Oxford.
1924 Surrealism “Beyond realism”, the dominant force in Western art between the wars. It followed the Freudian theory of the unconscious.
1925 British Summer Time Introduced by the Daylight Saving Act, and now useless and irritating.
1926 Television Technically a barbarism. The correctly formed word would be either proculvision or teleopsis.
1927 Talkie Abbreviation for a talking as opposed to a silent film. The Jazz Singer starring Al Jolson singing Mammy was released in this year.
1928 Penicillin Alexander Fleming obtained it from a fungus. The Latin penicillus means a tuft of hairs. The asexual spores of the fungus are hairy.
1929 Maginot Line André Maginot, the French Minister of Defence, had this wall built along the eastern border of France but forgot about Belgium.
1930 Pluto The smallest planet. Its name, that of the Greek god of the Underworld, “the rich one”, was suggested by Venetia Burney (aged 11) of Oxford.
1931 Oscar Margaret Herrick, librarian of the Academy of Motion Picture Arts and Sciences, on being shown an Art Deco manikin, said: “He reminds me of my uncle Oscar.”
1932 Neutron An elementary particle, electrically uncharged. Modelled on electron, taken from the Latin neutralis “neither one thing nor the other”.
1933 Gestapo A contraction from Geheime Staats Polizei, the Nazi secret police agency.
1934 Belisha Beacon Black‑and‑white striped posts topped with yellow globes, introduced by Leslie Hore‑Belisha, to halt “mass murder” on the roads.
1935 Alcoholics Anonymous Founded by alcoholic Bill Wilson, based on bonding sessions and public confession of sins.
1936 Mickey Mouse Invented by Walt Disney, first called Mortimer Mouse: “He was my firstborn and how I achieved all the other things I ever did.”
1937 Surreal A back‑formation from the arts movement founded by Guillaume Apollinaire. The word has been degraded to mere waffle.
1938 Nylon Trade name for polyhexamethyleneadipamide, more economically called polyamide. Coined by the Du Pont company.
1939 Walter Mitty An ordinary person who is the hero of vivid dreams and daydreams of adventures, as in Thurber’s The Secret Life of Walter Mitty.
1940 Jeep Utility vehicle. Some say from G. P. general purpose. Probably influenced by Eugene the Jeep in Elzie C. Segar’s comic strip Popeye.
1941 Radar A method of detecting the position and velocity of an object. Acronym of Ra(dio) D(etecting) a(nd) R(anging). It won the Battle of Britain.
1942 Robotics The science of robots. A science fiction word come to life. Asimov’s “Three Universal Laws of Robotics” are accepted as the rubric of SF.
1943 Dam Busters Nickname for the 617 Squadron of the RAF, which destroyed the dams with bombs that skimmed like ducks and drakes, and flooded the Ruhr. Bouncing bombs invented by Barnes Wallis. Dam Busters pupped Ghostbusters etc.
1944 Doodlebug A nickname for the V‑1, Buzz Bomb or Flying Bomb. When its fuel ran out, it fell. Give the terror that flies by day a silly name to disarm it.
1945 Tupperware Plastic containers sold at home parties. The name suggests Cupperware, but an American moulding engineer, Earl Tupper, saw the light.
1946 Bikini The beach garment was named for Bikini, an atoll in the Marshall Islands, where an atomic bomb test was carried out in July.
1947 Flying saucer Fanciful name given to various unidentified disc‑shaped objects reported in the sky. This prevalent modern myth refuses to go away, and is nurtured by television and film.
1948 Scrabble The most popular board game. You “scrabble” to pick your letter tiles. Alfred Butts, its inventor, first called it Criss‑Cross.
1949 Big Brother A person or system that controls people’s minds and lives. The dictator in Orwell’s 1984, which also gave us the ambiguities of Newspeak.
1950 Nato Acronym for N(orth) A(tlantic) T(reaty) O(rganisation). An organisation formed as a counterbalance to the Evil Empire of the Soviet Union.
1951 Discothèque French coinage, cf bibliothèque, for a record library. Then a place where records for dancing were played, thence disco.
1952 Stoned Each generation invents its slang for “under the influence of drink/drugs”. This came from the slangfield of US jazz, and means cannabis.
1953 Rock ‘n’ roll Blend of rhythm and blues and country and western, with exaggerated movements. Sexual metaphor: “My baby rocks me with one steady roll.”
1954 Teddy boy Rebellious youth, from the Edwardian styles they wore, especially the boys’ tightly fitted trousers and jackets.
1955 Lego Toy bricks, successor to Meccano. Coined by a carpenter from Danish leg godt “play well”.
1956 Angry Young Man Overt or implicit reference to John Osborne’s play Look Back in Anger, featuring the archetypal AYM, Jimmy Porter. Other AYMs were Colin Wilson, Kingsley Amis and John Braine.
1957 Psychedelic Of a drug, producing an expansion of consciousness through greater awareness. From Greek words for “breath or soul” and “visible”.
1958 Silicon chip A tiny wafer of semiconductor material. Hence Silicon Valley, the Santa Clara valley, SE of San Francisco, thick with microelectronic firms.
1959 Hovercraft A vehicle that can travel across land and water on a cushion of air, invented by Christopher Cockerell.
1960 Laser A device that can emit a fierce, bright beam of light. Acronym of l(ight) a(mplification by) s(timulated) e(mission of) r(adiation).
1961 Catch‑22 Heller’s Catch‑22 gave us this expression for a double bind. Pilots could be excused duty only if insane. But one who refused to fly was evidently sane.
1962 Montezuma’s Revenge Or the Aztec two‑step. A jocular term for diarrhoea, “the trots”. Montezuma was the last Aztec emperor, murdered by Cortés.
1963 Rachmanism Exploitation of slum tenants by unscrupulous landlords. Eponym of dishonour from Peter Rachman (1919‑62), such a London landlord.
1964 Moog synthesiser An electronic musical instrument. Appropriate eponym of its inventor, Robert Moog.
1965 Miniskirt A short skirt, starting at least 4in above the knees, symbolic of Swinging London in the Sixties, and associated with Mary Quant.
1966 Cultural Revolution A cultural and socialist movement in Communist China, to combat “revisionism”. Cf Red Guards and Mao’s Little Red Book.
1967 Pulsar A small, extremely dense star which rotates quickly and emits regular pulses of radio waves. Like Rowan Atkinson or Madonna. A portmanteau of puls(ating) (st)ar.
1968 Fosbury flop A high‑jump technique invented by Dick Fosbury. Instead of the straddle or Western roll, one clears the bar headfirst and backwards.
1969 Moon buggy From the Apollo 11 moon landing. Cf splashdown and launchpad. Buggy was originally a horse carriage, then used for golf and babies.
1970 Butterfly effect According to chaos theory, a butterfly flapping its wings in one part of the world can set in train a reaction which may lead to a hurricane elsewhere.
1971 Workaholic One addicted to work. A portmanteau from work and alcoholic. The trendy suffix has produced shopaholic and chocoholic.
1972 Watergate The Watergate building in Washington DC contained the headquarters of the Democratic Party in the presidential election. Hence the burglary, Nixon’s resignation, and derivatives such as Camillagate and Squidgygate.
1973 VAT The value‑added tax, introduced by Anthony Barber, then Chancellor. An odious imposition intended to pick taxpayers’ pockets surreptitiously.
1974 Ceefax The first British teletext service, launched by the BBC. A whimsical coinage as if the viewer can “see facts”.
1975 Fractal A swirling shape generated by a computer by repeating a simple mathematical formula on different scales. The Latin fractus means broken.
1976 Punk rock Rock music with offensive lyrics and aggressive beat. Punk meaning “harlot” occurs in Measure for Measure.
1977 ERM The exchange‑rate mechanism, for stabilising countries within the EU, so that they can adopt the EMU or enjoy a Black Wednesday.
1978 Test‑tube baby Vivid media coinage for the medical term in vitro, “in glass”. Louise Brown was the first egg fertilised in a lab.
1979 Rubik cube Erno Rubik invented this infuriating puzzle of a cube seemingly formed by 27 smaller cubes, uniform in size but of various colours.
1980 Solidarity English rendering of Solidarność, the independent trade union movement in Poland led by Lech Walesa, and officially banned in 1982.
1981 SDP Social Democratic Party, the short‑lived breakaway from Labour founded by Roy Jenkins, David Owen, William Rodgers and Shirley Williams.
1982 CD The compact disc, which replaced gramophone records, vinyl, LPs and tapes, with long‑lasting sound and collectable packaging.
1983 Aids A(cquired) I(mmune) D(eficiency) S(yndrome). Originally known as G(ay)-R(elated) I(mmune) D(eficiency). Has become the Black Death of our century.
1984 Yuppie Y(oung) U(pwardly mobile) or U(rban) P(rofessional): rich young professionals with more money than sense, characterised by red braces and bray.
1985 Glasnost Russian for “openness”. Gorbachev’s reforms, with perestroika (economic restructuring), were landslides that brought down the Soviet empire.
1986 Mexican wave Narcissistic ripple created by spectators standing up in successive waves, waving arms, and then sitting down. From World Cup in Mexico.
1987 PEP A Personal Equity Plan, or Everyman a Capitalist by investing in British companies and being excused capital gains and income tax.
1988 Acid house Funk‑based, electronically edited disco music with hypnotic sound effects, associated with hippy culture and the drug Ecstasy.
1989 Velvet Revolution The swift and orderly transition from dictatorship to democracy in Czechoslovakia, led by playwright and President Václav Havel.
1990 Crop circle Patterns such as rings appearing in growing cereal. Mini‑tornados? Electromagnetics? Hoaxers? Please, not aliens from outer space.
1991 Ethnic cleansing Euphemism for racial expulsion and massacre, as practised by Serbs, and in Rwanda and Zaïre.
1992 Clone Hello! Hello! Dolly! Dolly! But also stealing a cellular phone (clone) and giving it the same identity as an existing one.
1993 Information superhighway Network of computers transferring digital information at high speed, and us into the Brave New World of Cyberspace.
1994 National Lottery Camelot is not King Arthur but Mystic Meg, Anthea Turner and two quid down the drain every week for each gambling‑crazy Brit.
1995 Road rage Aggressive, sometimes murderous, temper of a motorist towards another road‑user, brought about by traffic congestion, envy and intolerance.
1996 Alcopop Alcoholic drink that tastes like a soft drink, unscrupulously created in order to attract children with sweet palates towards the hard stuff.
1997 Blairite Bambi, the Great Leader, tough and tender, caring but with a hard edge, the most popular leader in history at the cutting edge of waffle.
A friend notes that March 4, 2005, the day on which this is written, is exactly 72 years to the day later than the day on which Franklin Roosevelt took office as president and assured Americans that “the only thing we have to fear is fear itself.”
Exactly 72 years before that, Abraham Lincoln took the oath of office and appealed to “the mystic chords of memory.” Seventy-two years before that, George Washington took the oath of office as the new nation’s first president.
Lincoln began his address at Gettysburg with the words, “Four score and seven years ago” — a reminder that the span of time between the great battle and the Declaration of Independence was not much more than the biblical lifespan of “three score and ten.” We stand today one lifetime away from Franklin Roosevelt, two lifetimes from Abraham Lincoln, only three lifetimes away from George Washington.
1933, 1861 and 1789 were all turning points, moments when men mattered: men of piercing intelligence, farsighted men of extraordinary character. Washington was a leader who knew he was setting precedents for a republic that he believed would some day be continental in extent; he purposely kept himself remote from his contemporaries and strove to control his raging temper. He unleashed the talents of the young Alexander Hamilton, whose Federalist papers had done so much to secure the ratification of the Constitution, to shape a strong and fiscally reliable federal government. But he alone had the prestige to hold the new republic together when it was split by partisans on either side of what was then a world war between monarchist Britain and revolutionary France.
Lincoln came to office facing the dissolution of the nation Washington had welded together a lifetime before. Little known to his fellow citizens, condescended to by his Cabinet, he pursued a zigzag course through military failure to success, from a promise to honor the laws that upheld slavery to the proclamation that slavery was no more, from savage battle to “malice toward none” and “charity for all.”
Another president might have let the Confederacy go — Lincoln’s predecessor thought he couldn’t do anything else — or might have compromised and readmitted the Southern states with slavery intact. Lincoln ensured that the promises made in the Declaration a lifetime before would be kept, if not fully in his lifespan then some day in a reunited United States.
In the lifetime between Lincoln’s first inauguration and Roosevelt’s, the nation grew into an industrial giant — swelled with immigrants as no nation had ever been — and into potentially the greatest military power on earth. But when Roosevelt took office, the economy was in a deathly downward spiral and demands for radical change were in the air.
Roosevelt’s economic policies, aimed at freezing the economy in place, have been faulted for prolonging the depression and choking off economic growth. But critics should also credit him for what he didn’t do. Roosevelt could have nationalized the banks in 1933; instead, he insured their depositors. He could have nationalized industries, as Woodrow Wilson had done in World War I. Instead, he regulated them, imposed unions on their managements and worked cooperatively with their executives in building the war industries into the arsenal of democracy that won World War II. In the process, the United States became the world’s greatest military power, with responsibilities it has been grappling with ever since.
It seems unlikely that George W. Bush will be a giant in history like the presidents inaugurated 72, 144 and 216 years before. The threats to the nation are not as great. Yet his presidency may come to be seen as another turning point, and one in which the president’s character, and the choices he need not have made but did make, could shape the nation for a lifetime to come.
Bush, as Yale’s John Lewis Gaddis has noted, has transformed American foreign policy more than any president since Roosevelt and has decided to wield America’s power proactively to advance liberty and democracy around the world. The recent advances toward democracy in the Middle East suggest he is on the side of history.
Bush is also working to transform government from the industrial era programs of Roosevelt’s day to post-industrial era programs and has had some successes — but how much is still uncertain. There will be turns in the road ahead, but Bush seems to be setting America on a course that was not inevitable and which could shape the nation for a lifetime to come.
Michael Barone is a senior writer for U.S.News & World Report and principal coauthor of The Almanac of American Politics.
President George W. Bush’s European schedule presented the White House with several difficult and complicated diplomatic questions. After all, the celebration of “V-E Day,” marking the end of World War II in Europe, was complicated by increased tensions with Russia and its neighbors. The President’s May 7 address in Riga, Latvia takes on an entirely new significance when we understand that the American president chose to speak in the capital city of one of the nations that had been enslaved by the Soviet Union for almost half a century.
Speaking on the eve of the 60th anniversary of V-E Day, President Bush first celebrated the anniversary of Adolf Hitler’s defeat. “The evil that seized power in Germany brought war to all of Europe,” the president stated, “and waged war against morality itself.” The president continued, “What began as a movement of thugs became a government without conscience, and then an empire of bottomless cruelty. The Third Reich exalted the strong over the weak, overran and humiliated peaceful countries, undertook a mad quest for racial purity, coldly planned and carried out the murder of millions, and defined evil for the ages. Brave men and women of many countries faced that evil, and fought through dark and desperate years for their families and their homelands. In the end, a dictator who worshipped power was confined to four walls of a bunker, and the fall of his squalid tyranny is a day to remember and to celebrate.”
Those were strong words, and the president could safely have ended there. After all, there is little risk in denouncing Hitler and celebrating the fall of the Third Reich.
But what the president said next was a significant departure from what American presidents had said in the past, and nothing less than direct criticism aimed at the heart of the Soviet legacy. After honoring the Baltic states for their struggle against tyranny, President Bush looked back, not only to the fall of the Third Reich, but to the foundation of the Cold War. “For much of Germany, defeat led to freedom. For much of Eastern and Central Europe, victory brought the iron rule of another empire. V-E Day marked the end of fascism, but it did not end repression. The agreement at Yalta followed in the unjust tradition of Munich and the Molotov-Ribbentrop Pact. Once again, when powerful governments negotiated, the freedom of small nations was somehow expendable. Yet this attempt to sacrifice freedom for the sake of stability left a continent divided and unstable. The captivity of millions in Central and Eastern Europe will be remembered as one of the greatest wrongs of history.”
Characterizing the Yalta agreement as “one of the greatest wrongs of history” was an amazing admission for an American president. After all, it was President Franklin D. Roosevelt who met at Yalta with British Prime Minister Winston Churchill and Soviet dictator Joseph Stalin. The “Big Three” met in the Crimea as the war in Europe was coming to an end—but as Hitler still had a massive number of troops on the ground. In anticipation of victory, Roosevelt, Churchill, and Stalin looked to the future. Roosevelt was looking for a way to establish an international organization that would prevent another global catastrophe as well as a means of returning American troops within two years after victory. Churchill was working to establish France as a Continental great power that would relieve Britain—bankrupted by years of war—from the sole position of leadership in post-war Europe. Stalin was looking for far more. He arrived in Yalta determined to establish Soviet supremacy over Central and Eastern Europe—and he left the conference having achieved all of his goals.
The distance of six decades allows a more dispassionate reconsideration of what was really at stake at Yalta, and what really happened as a result of the agreements forged there. For years, many have criticized Roosevelt for refusing to use American military power to force Soviet troops out of Poland and other occupied territories. Others counter that such a move would have led to World War III—a war the American people were almost certainly unprepared and unwilling to fight. No doubt, the Soviets already controlled most if not all of the lands they were claiming. Nevertheless, the Yalta agreement led directly to the enslavement of millions of people and to the deaths of other millions in the Soviet gulags and the machinery of the Soviet system.
President Bush’s statements in Riga served to set the record straight. With amazing candor and remarkable grace, the American president returned to what is now a painful moment in our own nation’s history. “The end of World War II raised unavoidable questions for my country: Had we fought and sacrificed only to achieve the permanent division of Europe into armed camps? Or did the cause of freedom and the rights of nations require more of us? Eventually, America and our strong allies made a decision: We would not be content with the liberation of half of Europe—and we would not forget our friends behind an Iron Curtain. We defended the freedom of Greece and Turkey, and airlifted supplies to Berlin, and broadcast the message of liberty by radio. We spoke up for dissenters, and challenged an empire to tear down a hated wall. Eventually, Communism began to collapse under external pressure, and under the weight of its own contradictions. And we set the vision of a Europe whole, free, and at peace—so dictators could no longer rise up and feed ancient grievances, and conflict would not be repeated again and again.”
The president’s admission that the Yalta Agreement was “one of the greatest wrongs of history” represented both a diplomatic landmark and an act of moral courage. His tracing of American efforts to encourage the enslaved peoples of Europe reminds us all that tyranny, once established, must be confronted with both military and moral force.
Looking even beyond the president’s comments, Americans would do well to look back to Yalta with both embarrassment and moral analysis. As historian Arthur Herman recalls, the Yalta agreement was based upon several principles now revealed to be fallacious. As he writes, “The first of these fallacies was that collective security is more important than democracy and human rights.” Exhausted after years of war, the Allies sought to preserve their own interests rather than to press for human rights and the establishment of democratic governments. The Yalta agreements gave lip service to democracy, but were written with such elasticity that the Soviets could redefine democracy in their own terms. Herman points to the United Nations as the focus of the second Yalta fallacy: “Multilateral bodies can create common purpose among nations with conflicting interests.” As he argues, the framers of Yalta believed that “the United Nations would succeed where the League of Nations had failed. Instead, the U.N. would prove to be just another theater for superpower conflict over the ages—and by including two of Stalin’s puppet Soviet republics as members, Yalta fatally blurred the distinction between democratic and despotic regimes as legitimate voices of the ‘world community.’”
By the time the Allies met at Yalta, Churchill was growing exhausted and Roosevelt was nearing death. Of the three, only Stalin would continue in power and thus be able to exert his personal leadership. The absence of Churchill and Roosevelt from the world scene—in the case of Churchill at least for a time—gave Stalin virtually unrestricted opportunity to redefine the terms of the Yalta agreement. When V-E Day finally arrived on May 8, 1945, Roosevelt was dead. His successor as president, Harry S. Truman, inherited the legacy of Yalta. Speaking on V-E Day, Winston Churchill seemed to understand that Yalta meant disaster for much of Europe. In his radio broadcast, Churchill warned: “On the continent of Europe, we have yet to make sure the simple and honorable purposes for which we entered the war are not thrust aside or overlooked.” He understood that the defeat of the Third Reich would mean little, if totalitarian or police Governments were to take the place of the German invaders.
Nevertheless, even Churchill—who had long been concerned that Roosevelt underestimated Stalin’s intentions—believed that there was little the U.S. and Britain could do to prevent the Soviet conquest of occupied territories.
“It is not permitted to those charged with dealing with events in times of war or crisis to confine themselves purely to the statement of broad general principles on which good people agree,” Churchill recalled in his memoirs. “They have to take definite decisions from day to day. They have to adopt postures which must be solidly maintained, otherwise how can any combinations for action be maintained? It is easy, after the Germans are beaten, to condemn those who did their best to hearten the Russian military effort and to keep in harmonious contact with our great Ally, who had suffered so frightfully. What would have happened if we had quarreled with Russia while the Germans still had two or three hundred divisions on the fighting front? Our hopeful assumptions were soon to be falsified. Still, they were the only ones possible at the time.”
Sadly, Churchill may have been right. The assumptions of Yalta may have appeared to be “the only ones possible at the time.” They are not the only possible assumptions of our time, however. President George W. Bush set the record straight in characterizing the Yalta agreement as “one of the greatest wrongs of history.” The President demonstrated courage as he spoke these words to a nation that had fallen under Stalin’s iron grip. Let’s hope the world was listening.
R. Albert Mohler, Jr. is president of The Southern Baptist Theological Seminary in Louisville, Kentucky.
Some teachers appear to be larger than life, influencing successive generations of students with displays of erudition, inspiration, and a dash of drama. Professor Donald Kagan of Yale University is one of those teachers, and he delivered a lecture to the entire nation on May 12 as he presented the 2005 Jefferson Lecture in the Humanities.
Sponsored by the National Endowment for the Humanities, the Jefferson Lectures are the nation’s top prize in the humanities, and the list of previous lecturers makes this point clear. At the same time, like virtually everything said or done in Washington, the lectures carry a political dimension as well. Professor Kagan’s lecture, “In Defense of History,” was indeed a bold defense of history, delivered in the face of postmodern critics, deconstructionists, and cultural relativists.
Born in Lithuania in 1932, Kagan has taught at Yale since 1969. President George W. Bush presented him with a National Humanities Medal in 2002—a signal achievement for a man who was the first in his family to attend college.
Kagan is best known for his work on the Peloponnesian War and the history of classical civilizations. In his view, history is more than an interesting story or a battleground for competing ideologies—it is the ground from which we understand the present by understanding the past.
“Without history, we are the prisoners of the accident of where and when we were born,” Kagan has said. The study of history allows living persons to learn from those long dead and, by extension, to emulate their successes and avoid their failures.
This view of history is under assault in today’s academy—and particularly among the academic elites. For most of the last two decades, history departments have been hiring faculty members who, by and large, believe that no objective account of history exists, and thus that history is nothing more than a realm of competing ideologies and inconclusive debates.
In his Jefferson lecture, Kagan presented a bold defense of history—and the humanities—against the claims of the postmodernists.
“I come as a defender of the faith, of the humanities as they were understood ever since the invention of the concept many centuries ago,” Kagan announced. Without embarrassment, he cited the Renaissance humanist Pietro Paolo Vergerio, who argued that the humanities—the traditional liberal arts—represent “that education which calls forth, trains and develops those highest gifts of body and mind which ennoble men and which are rightly judged to rank next in dignity to virtue only, for to a vulgar temper, gain and pleasure are the one existence, to a lofty nature, moral worth and fame.”
That quotation from Vergerio announced that Kagan is determined to defend a hierarchy of values as learned from the ancients and understood by the study of history. The liberal arts were intended to train the intellect, in order “to produce an intrinsic pleasure and satisfaction” that would also benefit the larger community. This education is intensely moral, Kagan understands, intended to train the educated individual to be eloquent and wise and “to know what is good and to practice virtue, both in private and public life.”
In contrast, some postmodern critics deny that history has any objective “meaning” and that anything known as virtue even exists. As Kagan stated, many modern teachers in the humanities are those “who deny the possibility of knowing anything with confidence, of the reality of such concepts as truth and virtue, who seek only gain and pleasure in the modern guise of political power and self-gratification as the ends of education.” Those are fighting words, and Kagan delivered a stinging rebuke to the modern enemies of history.
“Among them it is common to reject any notion of objectivity, of truths arrived at by evidence or reasoning external to whims or prejudices,” he asserted. He aimed particular criticism at those who claim that history must be “deconstructed” by literary criticism. These critics “assert that all studies are literature, all, therefore subject to the same indeterminacy as all language.”
In the course of his lecture, Kagan considered the contrast between the classic understanding of the artist found in Greek civilization and the modern concept which he traced to the Romantic movement of the modern age. “Ever since the beginning of the Romantic movement the dominant belief has been that a true poet or artist, whatever his genre, must be a rebel against the established order of society,” Kagan asserted. “Writers of the past who don’t fit the model seem always to be merely the victims of their place in corrupt societies or stooges of those who ruled them. The modern critic who discovers this is, of course, free from such influences.”
In other words, critics who assail the writers of the past as being ideologically blind and ignorant of their own oppression simply assume or assert that they are themselves liberated from such constraints and limitations. The modern concept of the artist as rebel produced literature “that is shaped merely by its author’s time and his place within the society, by his prejudices and purposes” and “is a poor and weak thing that deserve(s) the social scientific analysis and pseudo-philosophical mumbo-jumbo that passed for literary criticism in our day.”
Kagan understands that we live in a time that is hostile to any claim for the value of history. The claim that history is important “has rested chiefly on its search for truths arrived at by painstaking research conducted with the greatest possible objectivity, explaining events by means of human reason.” This is precisely the understanding of history that is increasingly out of vogue in the modern academy.
Kagan is most at home with the ancients, conversing with Herodotus, Thucydides, and Livy. From the ancients, Kagan emerged with a coherent and ambitious understanding of the historian’s task. “These are the missions for the historian: to examine important events of the past with painstaking care and the greatest possible objectivity, to seek a reasoned explanation for them based on the fullest and fairest possible examination of the evidence in order to preserve their memory and to use them to establish such uniformities as may exist in human events, and then to apply the resulting understanding to improve the judgment and wisdom of people who must deal with similar problems in the future.”
The historian is more than a chronicler, Kagan insists. The historian’s singular task is to identify the truly important story—the events infused with meaning.
Herodotus, identified by Kagan as “the first true historian,” wrote of the war between the Greeks and the Persians “so that time may not blot out from among men the memory of the past, and that the fame of the great and marvelous deeds done by Greeks and foreigners may not be lost, and especially the reason why they fought against each other.”
That approach is what makes history both interesting and important, but the very idea of great events, great individuals, and great deeds is looked upon with condescension in today’s academic environment. As Kagan explains, “The traditional great events and subjects: high politics, constitutions, diplomacy, war, great books and ideas, are not to be considered, except to show why they must be excluded as the product of dead white males engaged in the permanent process of oppressing good ordinary people of one kind or another. The purpose of the enterprise is not to seek the truth with the greatest objectivity one can muster but to raise the consciousness of the oppressed, to bring them the self-esteem they will need to overthrow the current version of this ancient establishment.”
Kagan looks to his fellow historians for rescue from this postmodern predicament. Even though university historians “have given far too much ground to such mindlessness promoted by contemporary political partisanship,” Kagan believes that the historians “are better situated than their colleagues in the other humanities to recover their senses.”
In a very real sense, Professor Kagan was calling his fellow professors in the humanities to acknowledge a moral dimension to the liberal arts, to establish virtue as an honorable goal, and to affirm truth as something that is both real and knowable. In other words, Kagan was proposing a platform for moral recovery now that religion has faded in influence. As he explains, “If we cannot look simply to moral guidance firmly founded on religious precepts it is natural and reasonable to turn to history, the record of human experience, as a necessary supplement if not a substitute. History, it seems to me, is the most useful key we have to open the mysteries of the human predicament.”
That statement reveals both the glory and the futility of Professor Kagan’s approach. His defense of history and historical knowledge is intellectually brilliant and courageous. Nevertheless, his confidence that history can “open the mysteries of the human predicament” is disastrously misplaced.
In the Christian analysis, history, taken with full intellectual respect, unquestionably illustrates the human predicament. Christians can agree with the claim that history reveals moral lessons through the rising and falling of empires and the crucibles of human conflict.
Nevertheless, the “mysteries of the human predicament” are understood only by revelation. The humanities have their place—and a recovery of sanity in the liberal arts would be a tremendous cultural achievement—but the deepest truths about humanity can come only from our Creator and can be understood only against the backdrop of eternity.
Professor Donald Kagan’s brave address is worthy of serious attention. Americans should be encouraged to know that the National Endowment for the Humanities provided the occasion for such an important presentation of substantial ideas. For Christians, this event should serve as another reminder of why we, of all people, must look to history with respect, humility, and seriousness.
R. Albert Mohler, Jr. is president of The Southern Baptist Theological Seminary in Louisville, Kentucky.
Karl Marx is the greatest philosopher of all time. Or at least this is what many BBC Radio listeners suggested recently when asked to nominate such a person. To the surprise of some, Marx topped the poll, beating, by wide margins, thinkers ranging from Aristotle to Kant.
Marx wrote many things, including admiring words about capitalism which he regarded as a definite advance on previous economic arrangements. The BBC result, however, underlines a strange blindness about Marx persisting within Western societies.
In one sense, this is nothing new. In the 1930s, intrepid Westerners traveled to the U.S.S.R. and returned saying that they had seen the future. Somehow they managed not to see the purges, the collectivization, and the gulags that resulted in the imprisonment and deaths of millions. Communism, it is often said, was a godless system. This is not quite right. Communism was godless insofar as it was based upon an atheistic vision of man. Yet Communism did have its gods. It had its deities to whom anything and anyone could be sacrificed.
One response is to claim that Marx’s philosophy was distorted by Lenin and Stalin. Marx himself, one often hears, was a humanist who wished to liberate people from their chains. Other apologists insist that one can distinguish between the young Marx and the old Marx: the youthful philosopher being more humanistic than the grayer, more callous political revolutionary.
Even cursory attention to Marx’s writings quickly reveals the hollowness of such defenses. A consistent anti-human vision runs throughout Marx’s thought. For Marx, man is a being whose origins are irrelevant, whose future is extinction, and whose present is beyond his control. Even people living in Marx’s Communist society have no possibility of a meaningful existence. Marx once described Communist society as one in which it would be possible “to do one thing today and another tomorrow; to hunt in the morning, fish in the afternoon, breed cattle in the evening and criticize after dinner, just as I please.”
This sounds idyllic until one realizes that, from Marxism’s perspective, none of these activities can have any value for humans. For true materialists, there can be no qualitative difference between reading and fishing, working and sleeping, living and dying. Everything has the same value and therefore no value. In this world, there is no difference between Mother Teresa’s work and that of a concentration camp guard. They share equally in a general irrelevance of everything and everyone.
This tells us that Marxism cannot be interested in justice or liberty. It insists that we are like driftwood, floating hither and thither on the waves of history. In such a world, our lives matter naught. Our deaths are irrelevant. We merely try and salvage whatever animal satisfaction we can from life, before our essential nothingness is finalized in our ultimate annihilation as living beings.
So much for Marx’s humanism. A more serious problem with Marxist philosophy is its legitimizing of criminality.
By “criminal,” I do not simply mean the occasional breaking of law. Rather, I mean a situation whereby people decide that they are above law; that they are not subject to law; that law is merely another tool of power. For if Marxism is right and materialism is true, then systematic violence in pursuit of political goals is acceptable.
The irony is that while millions today know about the Nazis’ unspeakable crimes, rather fewer know about the atrocities committed by Lenin, Stalin, Castro, Pol Pot and other Marxists. It is as if there has been a subtle agreement not to discuss these crimes. This studied ignorance manifests itself when we observe red flags emblazoned with hammers and sickles waved at demonstrations. Do those waving them know what the red flag means for those who were enslaved and killed by Marxist regimes? Why is Marxism’s red flag not treated with the same contempt rightly attached to the swastika?
Marx, of course, died years before his followers managed to seize power. But one suspects that Marx would have applauded Communist use of violence. Marx himself advocated hanging capitalists from the nearest lamppost. “When our turn comes,” Marx warned his opponents, “we shall not disguise our terrorism.”
Much violence has been done in the name of philosophies and religions, including Christianity. The difference is that Christianity contains moral criteria according to which we can judge and condemn such activity on the part of Christians. Marxism never had and could never have such standards. For in Marxist philosophy, there is no place for love of God and love of neighbor. Perhaps that, above all, is what makes Marx so unworthy of contemporary admiration.
Dr. Samuel Gregg is Director of Research at the Acton Institute in Grand Rapids, Mich. He is the author of Economic Thinking for the Theologically Minded (University Press of America, 2001) and On Ordered Liberty: A Treatise on the Free Society (Lexington Books, 2003).
Our moral imagination is haunted by monsters—and the greatest aspect of this horror is the fact that so many monsters are real. Is the world ready to face the reality of Mao Zedong?
For the last seven decades or so, Mao has been a focus of admiration among many on the Left. Many Americans have known Mao primarily through the work of sympathetic biographers who became champions of the Chinese Communist regime. For many others, Mao has remained a man of mystery, whose true character and legacy have been hidden from Western eyes. All that is about to change. The publication of Mao: The Unknown Story by Jung Chang and her husband Jon Halliday will force a radical reformulation of Western understandings of Mao—and the book is virtually certain to exercise a vast influence within China as well.
Ms. Chang, author of the much-acclaimed memoir Wild Swans, has—with her husband, historian Jon Halliday—produced a devastating analysis of Mao and his legacy. They do not present a pretty picture.
“I decided to write about Mao because I was fascinated by this man, who dominated my life in China, and who devastated the lives of my fellow countrymen,” Ms. Chang recounts. “He was as evil as Hitler or Stalin, and did as much damage to mankind as they did. Yet the world knows astonishingly little about him.”
Why would this be so? Writing in the October 2005 issue of Commentary, Arthur Waldron, Professor of International Relations at the University of Pennsylvania, draws a distinction between the popular rejection of Adolf Hitler and the celebration of Mao.
“The 20th century was remarkable not only for the number and scale of the atrocities it witnessed but also for the slowness with which these frightful events were recognized for what they were, let alone condemned,” Waldron observes. This was certainly true of the Holocaust, but Adolf Hitler is almost unanimously acknowledged as one of the greatest criminals in history. His name is met with revulsion, and those who would celebrate Hitler’s legacy are rightly considered the enemies of humankind.
Not so with Mao. As Waldron notes, “Today, no one in his right mind would put a portrait of Hitler in his house. Yet, in many places in the West, Mao kitsch—posters, badges, busts, and so forth—is still considered not only acceptable but even fashionable.”
Mao’s positive reputation in the West was made possible largely through the nefarious efforts of historians and writers who sacrificed the truth in order to further Mao’s interests. The prime example of this propaganda literature is Red Star Over China by journalist Edgar Snow. We now know that Snow was duped by Mao and that Maoist authorities edited the book in order to meet their own purposes. Beyond this, many of the events detailed in the book are now known never to have happened. As historian Keith Windschuttle recounts, Snow transformed the reputation of Mao and the Chinese Communists. “He portrayed Mao and his supporters as heroic figures, dedicated to liberating their country from both the foreign invaders and the hopelessly corrupt Nationalists.” According to Windschuttle, “Snow’s book played a major role in converting public opinion in both America and Europe towards a more favorable view of Mao. Its biggest impact, however, was in China itself, where it had a profound influence on radical youth.”
Edgar Snow would eventually be discredited as a journalist, and his book would be revealed to be little more than baseless propaganda. Nevertheless, the book remains in print and its impact continues.
Other leftist writers and figures joined Snow in praising Mao and his regime. John K. Fairbank, a Harvard professor, returned from a visit to China and remarked: “The Maoist revolution is on the whole the best thing that has happened to the Chinese people in centuries.” Feminist philosopher Simone de Beauvoir excused Mao’s murderous regime by arguing that “the power [he] exercises is no more dictatorial than, say, Roosevelt’s was.” Jean-Paul Sartre, de Beauvoir’s consort, celebrated Mao’s “revolutionary violence,” declaring it to be “profoundly moral.”
Waldron points to the fact that there has been no repudiation or reevaluation of Mao’s leadership within China. “China has never repudiated Mao as Khrushchev did Stalin at the Party Congress of 1956,” he notes. Mao’s face continues to shine over Tiananmen Square, and his cult of personality continues, even as his embalmed body remains the nation’s central object of veneration.
The official party line about Mao presents him as a liberator who emerged as the popular leader of a revolt against oppression, both foreign and domestic. The “Mao Myth” centers on claims of heroism during the “Long March” of 1934-1935, when Mao and his Communists supposedly fled from their base in the south of China to a refuge in the north.
Edgar Snow constructed the myth of the Long March in order to present Mao as an heroic figure who deserved popular support and foreign respect. As it turns out, the account was a total fabrication. Even the famous crossing of the suspension bridge over the Dadu River turns out to have been pure fiction.
This much is clear—Mao wasn’t counting on the opening of the Soviet State archives. Jung Chang and Jon Halliday have performed a massive feat of research, drawing on personal experience, hundreds of interviews, and years spent combing historical documents—especially those released with the fall of the Soviet Union.
Now, as Ms. Chang makes clear, Mao is revealed as “the biggest mass murderer in the history of the world.”
That is quite a statement, of course. Yet, even by the murderous standards of the twentieth century, Mao emerges as the greatest murderer of them all. Chang and Halliday carefully document their claim that at least seventy million people died as a direct result of Mao’s policies. They died as victims of his cult of personality, and their lives were sacrificed to nothing more than Mao’s bloodlust and desire for personal power.
Reviewing the evidence, Arthur Waldron agrees: “Mao was the greatest mass murderer of the 20th century. Much of the killing was direct, as in the torture and purges at Yan’an. After the Communist seizure of power in 1949, the practice became countrywide. Mao set his numerical targets openly, and stressed the ‘revolutionary’ importance of killing.”
Li Rui, a former secretary to Mao, sent a paper to a conference held at Harvard University two years ago. He declared that “Mao was a person who did not fear death and he did not care how many were killed. Tens of millions of people suffered during every political movement and millions starved to death.”
Like so many other mass murderers, Mao developed a taste for killing. After watching peasants kill their landlords during an uprising in the late 1920s, Mao wrote a poem: “Watch us kill the bad landlords today. Aren’t you afraid? It’s knife slicing upon knife.” Mao suggested that the landlords be killed more slowly, in order to magnify their agony.
Being close to Mao didn’t help. When Chou Enlai, Mao’s closest associate, was diagnosed with bladder cancer, Mao insisted that Chou should never be told of the condition nor treated for it. Thus, Chou Enlai died slowly and painfully.
Mao’s cult of personality took programmatic shape in his erratic campaigns. The “Hundred Flowers campaign” was followed by the tragic “Great Leap Forward,” which was in turn followed by the “Cultural Revolution.” Eventually, all of these movements ended with murderous purges that removed any competitors to Mao’s personality cult.
Michael Yahuda, Professor Emeritus at the London School of Economics, provides one of the most concise descriptions of Mao and his legacy. “Mao had none of the skills usually associated with a successful revolutionary leader. He was no orator and he lacked either idealism or a clear ideology. He was not even a particularly good organizer. But he was driven by a personal lust for power. He came to dominate his colleagues through a mixture of blackmail and terror. And he seems to have enjoyed every minute of it. Indeed what he learned from his witnessing of a peasant uprising in his home province of Hunan in 1927 was that he derived a sadistic pleasure from seeing people put to death in horrible ways and generally being terrified. During the Cultural Revolution he watched films of the violence and of colleagues being tortured.”
The cult of Mao has continued, especially in the West, because the Left has never repudiated the man, his Party, and his tyrannical and murderous regime.
Within China, Mao is still presented as a great man (one Communist Party statement oddly judged Mao to be “70% good” and “a great Marxist”).
Chang and Halliday have performed a tremendous public service in researching and writing this important book. As Arthur Waldron rightly observes, “This is the book that will wreck Mao’s reputation beyond salvage.” This can’t happen too soon.
Chang and Halliday begin their book with a simple declaration: “Mao Tse-Tung, who for decades held absolute power over the lives of one-quarter of the world’s population, was responsible for well over 70 million deaths in peacetime, more than any other twentieth-century leader.” We should be thankful that the truth is now known.
Mao: The Unknown Story is certain to be banned in China. Nevertheless, in today’s information economy, this book will be difficult to hide. If the truth ever gets out, China is likely to experience a genuine cultural revolution.
R. Albert Mohler, Jr. is president of The Southern Baptist Theological Seminary in Louisville, Kentucky.
Historians, to whose company I belong, are often taught to feel irrelevant. Survey after survey shows that most citizens know appallingly little about the past, including “their” past, the past on the basis of which they make decisions. Whether the fault is with us historians for doing our job badly or with publics for failing to pay attention is hard to discern. One point ought to be clear, however, in these days when our sub-cultures fight other sub-cultures and our “multi-” groups fight other “multi-” groups: Much of that fighting is about religion. In such encounters, historical accounts are often misrepresented and become inflammatory sources of conflict.
This week, a Wall Street Journal story by Daniel Golden showed just how tense debates are over how religious history is taught in public school textbooks (January 25). He described Hindu, Islamic, and Jewish groups complaining about and fighting over the way these texts treat their pasts. It is obvious that textbook writers, school boards, administrators, and teachers are “damned if they do and damned if they don’t” touch religion, and are pretty much damned however they do do religion. (For good background on the complexity of all this, read Kent Greenawalt’s Does God Belong in Public Schools? Greenawalt shows how hard it is to make judicial and judicious decisions on this subject).
A reminder: Those who criticize the United States Supreme Court for having been secular and a secularizer fail to note that the famed “school prayer decisions” of 1962-1963 — which ruled against using classrooms and school instrumentalities for devotional exercises, prayer, and the like — strongly urged that religion as such should be taught. Without knowledge of the religious past and religious peoples’ ways of doing things, how can we understand the present? we were asked. There are agencies that try to supply texts, but their books have not been adopted as widely as one might expect. Here’s one reason for this: In the end, most agitators for religion in the schools want their religion to be favored, privileged, and taught.
Golden points to interest groups in the various faiths, each of which has a point, and most of which overstate their cases. Hindus do not like reference to polytheism, the caste system, the inferior status of Indian women, and “sati” (the burning of widows on their husbands’ pyres). Some of the self-appointed agitators play rough, attacking scholars of Hinduism who do not satisfy them. We do not have space here to detail what Jews and Muslims have not liked, but it takes little imagination. And while Golden does not concentrate on them, some Christians have tried hardest to dictate how Christians get covered. Golden also portrays fair-minded scholars who do their best to tell the truth, but are caught in crossfires.
All this is fateful, since the decisions of boards in giant California and Texas markets come under every kind of pressure. If California and/or Texas votes “no” on a book, it stands little chance. A “yes” assures a market — but not a free ride, because someone will protest something in each book, and there’ll soon be another expensive revision. We are learning from this that you can’t satisfy everyone and that religion is not a “private affair” but always a hot topic in a republic where we cannot settle things, but have to live with messiness.
Occasional Reference Note:
We do regular sightings of religious events from Wall Street Journal news coverage. Readers who quite accurately see that paper’s editorial page as conservative sometimes blur the distinction between news and editorial bias there. This week I learned that Tim Groseclose, a political scientist, and Jeffrey Milyo, an economist, along with twenty-one research assistants, combed through ten years of U.S. media coverage and found “a systematic liberal bias” (see http://www.polisci.ucla.edu/faculty/groseclose/Media.Bias.pdf). But hold on: Using their scales and measurements, they announced that “one surprise is that the Wall Street Journal ... we find as the most liberal of all twenty news outlets,” and cited a 2002 survey which found it the second most liberal. So we cannot gain points with conservatives when we quote the Journal, just as we should not lose points with liberals who are suspicious of it. So there ....
Martin E. Marty’s biography, current projects, upcoming events, publications, and contact information can be found at www.illuminos.com.
Recent films from “The Passion of the Christ” to the fictional “Da Vinci Code” may have reignited the public’s interest in matters of faith, but a newly completed movie could go far beyond the impact of both blockbusters, potentially verifying the historical accuracy of much of the Bible.
The two-hour feature documentary titled “Exodus” has been in the making for five years, and is expected to be released in the spring of 2007.
It covers events recorded in the books of Genesis and Exodus, beginning with the exploits of Joseph, the son of Jacob who was betrayed by his brothers and sold into slavery but eventually rose to become the second most powerful man in the known world after predicting seven years of famine in Egypt. It then moves on to biblical accounts involving Moses, the plagues on Egypt, the parting of the Red Sea, and eventually Mount Sinai, where God is said to have given the Ten Commandments to the ancient Israelites.
“‘The Da Vinci Code’ is about fiction. We talk about reality,” said Dr. Lennart Moller, a Swedish DNA researcher at the Karolinska Institute in Stockholm, who stars in the film.
The film not only recounts what’s written in Scripture, but goes on a multinational mission to document the evidence for the events recorded in Scripture.
“We have been at places no one else has ever been. We have found things no one else has ever found,” Moller told WND.
Moller, who authored “The Exodus Case,” has conducted a 10-year study, even looking into claims of ancient Egyptian chariot wheels found at the bottom of the Red Sea, but notes the claims of chariot wheels are just the beginning.
“There are much more sensational finds on land,” he said.
Appearing in the film with Moller is producer Tim Mahoney of the Mahoney Media Group in Minneapolis. He acts as sort of an “everyman” as he calls it, asking questions of Moller that the typical person on the street might have about the physical evidence discovered.
Mahoney says even though the film is a documentary, it has a lot more entertainment than typical TV documentaries, including 3-D animation and a symphony and choir recorded in Surround Sound in Budapest, Hungary.
They’ve even secured rights from the family of late director Cecil B. DeMille to show some footage from the 1956 classic “The Ten Commandments.”
Already the new film is being screened to select audiences, many of whom have no previous knowledge of the Bible.
“I’ve shown it to rabbis and pagans, and I’ve gotten excellent reviews from both,” said Mahoney.
Mahoney calls it the “most involved investigation” ever taken into the story of the Exodus, noting many other researchers seem to have little interest in determining the veracity of the biblical text.
“What we found out is that this subject is off-limits in academia. There’s not really an honest investigation into it,” he said. “They tend to be biased that it didn’t happen. This [film] is an unbiased look at the history and geography.”
Among the items featured in the film is the route the filmmakers believe the Israelites took when they were freed from slavery in Egypt and crossed the Red Sea.
While some scholars have alleged the Israelites crossed a “sea of reeds” and not the Red Sea, recent evidence suggests there was an actual crossing of the sea, beginning at a beachhead in Nuweiba, Egypt, and moving across the Gulf of Aqaba into Saudi Arabia on the other side.
It is there, Mahoney and Moller believe, that the real Mount Sinai is located.
“It is a military-protected area with machine-gun guards,” said Mahoney.
When asked the Saudi government’s rationale in closely guarding a mountain in the middle of nowhere, Mahoney replied, “Because it’s an archaeological site that they don’t want people to get to.”
Moller believes the big movie companies have been “too scared” to take on this project because they consider the risks too high in the turbulent Middle East.
Producers are now in the courting stage with different companies interested in distributing and marketing the film.
In 1776, the first draft of the Declaration of Independence protested that Britain’s supposedly “Christian king” had “waged a cruel war against human nature” and violated “sacred rights of life & liberty” by enslaving Africans.
Further, it said, slaves often suffered “miserable death” in transit to America and King George had suppressed every attempt “to prohibit or to restrain this execrable commerce.”
The Continental Congress quickly deleted this moralistic language, written by a slave owner, Thomas Jefferson.
Students of history are regularly rewarded with such surprises. They’ll discover this one and many more in the college textbook Unto a Good Land: A History of the American People (Eerdmans). The 10-year production from six historians and 50 consultants covers American Indian life before Columbus through the 2004 election and war in Iraq.
No dry academic exercise, the flowing narrative makes this an enjoyable read for anyone seeking a broad overview of American history.
Historian A.J. Scopino at Central Connecticut State University says it’s “a splendid work of social and cultural history wherein religion earns its proper place.”
That religion aspect distinguishes Good Land from competitors. One cannot understand America and ignore its ever-present piety, so different from Europe. This textbook also fits the trend to treat the arts, science, minorities, women’s history and popular culture alongside the usual political and military power games.
Though Good Land is carefully nonsectarian and notes religion’s influence for both good (civil rights) and ill (witchcraft trials), it may prove a tough sell at secular universities.
Other random discoveries:
Though Columbus believed his explorations were divinely ordained, he nearly lost royal sponsorship when a committee of clergy, Spain’s only educated scholars, opposed him; he was saved by a Franciscan friar who had Queen Isabella’s ear.
Ever wonder why Brazil became a Portuguese colony while Spain claimed the rest of South America? That division was worked out in response to the pope’s carving up of the world map the year after Columbus sailed.
In the early 1600s, King James denounced users of the “filthie noveltie” of tobacco for “sinning against God, harming your selves both in persons and goods.” Virginia’s governor fretted that farmers endangered their health by raising profitable tobacco instead of vegetables.
Up in Massachusetts, meanwhile, pioneer colonists were declaring that American Indians held property rights to any land they cultivated and maintained. The basis cited for this law? Genesis 1:28 and 9:1, and Psalm 115.
“The first individual to bring some degree of unity to the colonies was not a politician,” we’re told, but evangelist George Whitefield. He drew huge audiences from Boston to Georgia beginning in 1739.
The states approved the U.S. Constitution by a mere eyelash, and there was opposition to the rule that “no religious test shall ever be required” to hold public office.
As of 1827, the U.S. South had 106 anti-slavery societies compared with only 24 in the North, and Southern agitators outnumbered Northerners nearly four to one. Later, abolitionism swelled in the Northeast and “the primary motivation was religious.”
FDR’s New Deal was a big deal. But many of its ideas originated in prior decades with fervently Protestant presidential candidate William Jennings Bryan and the 1919 social reform platform from America’s Roman Catholic bishops.
Back to Jefferson. As a public official he championed freedom of conscience but personally held fervent religious opinions. He literally took scissors to the New Testament to delete miraculous parts he disliked while leaving moral teachings.
He decided to establish the University of Virginia because the College of William and Mary refused to abandon its Episcopal Church ties.
He predicted with wishful thinking that there wasn’t a youth living in America “who will not die a Unitarian,” oblivious to the emerging evangelical movement that has persisted in various forms to the present day.
By Charles Krauthammer
There has hardly been an Arab peace plan in the past 40 years — including the current Saudi version — that does not demand a return to the status quo of June 4, 1967. Why is that date so sacred?
Because it was the day before the outbreak of the Six Day War in which Israel scored one of the most stunning victories of the 20th century. The Arabs have spent four decades trying to undo its consequences.
The real anniversary of the war should be now, three weeks earlier. On May 16, 1967, Egyptian President Gamal Nasser demanded the evacuation from the Sinai Peninsula of the U.N. buffer force that had kept Israel and Egypt at peace for ten years. The U.N. complied, at which point Nasser imposed a naval blockade of Israel’s only outlet to the south, the port of Eilat — an open act of war.
How Egypt came to this reckless provocation is a complicated tale (chronicled in Michael Oren’s magisterial history Six Days of War) of aggressive intent compounded with fateful disinformation. An urgent and false Soviet warning that Israel was preparing to attack Syria led to a cascade of intra-Arab maneuvers that in turn led Nasser, the champion of pan-Arabism, to mortally confront Israel with a remilitarized Sinai and a southern blockade.
Why is this still important? Because that three-week period between May 16 and June 5 helps explain Israel’s 40-year reluctance to give up the fruits of the Six Day War — the Sinai, the Golan Heights, the West Bank and Gaza — in return for paper guarantees of peace. Israel had similar guarantees from the 1956 Suez War, after which it evacuated the Sinai in return for that U.N. buffer force and for assurances from the Western powers of free passage through the Straits of Tiran.
All this disappeared with a wave of Nasser’s hand. During those three interminable weeks, President Lyndon Johnson tried to rustle up an armada of countries to run the blockade and open Israel’s south. The effort failed dismally.
It is hard to exaggerate what it was like for Israel in those three weeks. Egypt, already in an alliance with Syria, formed an emergency military pact with Jordan. Iraq, Algeria, Saudi Arabia, Sudan, Tunisia, Libya, and Morocco began sending forces to join the coming fight. With troops and armor massing on Israel’s every frontier, jubilant broadcasts in every Arab capital hailed the imminent final war for the extermination of Israel. “We shall destroy Israel and its inhabitants,” declared PLO head Ahmed Shuqayri, “and as for the survivors — if there are any — the boats are ready to deport them.”
For Israel, the waiting was excruciating and debilitating. Israel’s citizen army had to be mobilized. As its soldiers waited on the various fronts for the world to rescue the nation from peril, Israeli society ground to a halt and its economy began bleeding to death. Army chief of staff Yitzhak Rabin, later to be hailed as a war hero and even later as a martyred man of peace, had a nervous breakdown. He was incapacitated to the point of incoherence by the unbearable tension of waiting with the life of his country in the balance.
We know the rest of the story. Rabin recovered in time to lead Israel to victory. But we forget how perilous was Israel’s condition. The victory hinged on a successful attack on Egypt’s air force on the morning of June 5. It was a gamble of astonishing proportions. Israel sent the bulk of its 200-plane air force on the mission, fully exposed to antiaircraft fire and missiles. Had they been detected and the force destroyed, the number of planes remaining behind to defend the Israeli homeland — its cities and civilians — from the Arab air forces’ combined 900 planes was ... 12.
We also forget that Israel’s occupation of the West Bank was entirely unsought. Israel begged Jordan’s King Hussein to stay out of the conflict.
Engaged in fierce combat with a numerically superior Egypt, Israel had no desire to open a new front just yards from Jewish Jerusalem and just miles from Tel Aviv. But Nasser personally told Hussein that Egypt had destroyed Israel’s air force and airfields and that total victory was at hand.
Hussein could not resist the temptation to join the fight. He joined. He lost.
The world will soon be awash with 40th-anniversary retrospectives on the war — and on the peace of the ages that awaits if Israel would only return to June 4, 1967. But Israelis are cautious. They remember the terror of that unbearable May when, with Israel possessing no occupied territories whatsoever, the entire Arab world was furiously preparing Israel’s imminent extinction. And the world did nothing.
JERUSALEM – Prime Minister Ehud Olmert and the government agency here charged with overseeing excavations may have violated Israeli law when they permitted Islamic authorities to conduct a massive dig on the Temple Mount using a bulldozer, WND has learned.
The Waqf, the Muslim custodians of the Temple Mount, are accused of destroying Temple Mount antiquities, including a possible wall from the Second Jewish Temple.
Israel’s Antiquities Authority agreed to allow bulldozers and other heavy equipment to dig a massive trench on the Temple Mount, which the Waqf claimed was necessary to replace electrical cables outside mosques on the site. The dig, which extended to most of the periphery of the Mount, was protected by the Israeli police and was supposed to be supervised by the Israeli government’s Antiquities Authority.
According to Israeli and Palestinian diplomatic sources, the directive to allow the dig originated from Olmert’s office.
Allowing the use of bulldozers at any sensitive archaeological site is extremely unusual, particularly at the Temple Mount, which experts say contains sealed layers of artifacts as shallow as two to three feet below the surface. The Mount has never been properly excavated. Heavy equipment could easily damage any existing artifacts, say experts, who assert the area should be excavated slowly and carefully by hand.
According to informed diplomatic sources, the Antiquities Authority did not grant the Waqf any official permit or written document allowing the Muslim custodians to dig on the Mount.
The sources said all agreements with the Waqf were oral, contravening Israeli law.
Further, according to Knesset regulations, any excavation on the Temple Mount requires the approval of a Knesset committee established in 2000 specifically to oversee digs on the sensitive Mount site.
But the committee was not consulted prior to the Waqf dig.
The Antiquities Authority admitted it didn’t seek the required Knesset approval before allowing the Waqf to dig. Its director-general, Shmuel Dorfman, claimed Olmert’s office was not involved in the decision.
Dorfman further claimed no damage was done to any Temple antiquities during the dig, a statement dismissed as “absurd” by leading Mount archaeologists here.
“The [Israeli government] Antiquities Authority clearly and obviously allowed the destruction of antiquities,” charged third-generation Temple Mount archaeologist Eilat Mazar, speaking to WND. “What they did is the exact opposite of any proper archaeological supervision. Allowing a bulldozer to dig on the Mount is scandalous.”
Echoing Mazar’s comments, prominent Temple Mount archaeologist Gabriel Barkai told the Jerusalem Post: “The use of a bulldozer was like putting an elephant in a china shop. In such a sensitive spot, you cannot allow workers to use bulldozers. They should have dug by hand using special brushes and recorded every find scrupulously. I believe that serious damage was caused the moment they removed the earth, which was saturated with archeological findings.”
Muslims caught red handed
In September, after bulldozers dug a trench 1,300 feet long and five feet deep, the Muslim diggers came across a wall Israeli archaeologists believe may be remains of an area of the Second Jewish Temple known as the women’s courtyard.
Israel, though, blocked leading archaeologists from surveying the massive damage Islamic authorities are accused of causing to the purported wall, repeatedly refusing to allow Mazar and other prominent archaeologists up to inspect the Muslim dig.
In September, WND obtained a photo of the Waqf trench. In view in the picture, obtained in conjunction with Israel’s Temple Institute, are concrete slabs broken by Waqf bulldozers and what appears to be a chopped up carved stone from Jewish Temple-era antiquity.
Mazar confirmed the slabs were antiquities evidencing Temple-era attributes. She said inspection of the slabs was required to verify their authenticity.
The Waqf repeatedly denied it found or destroyed any Temple artifacts.
But on Sunday the Antiquities Authority released antiquities discovered by its archaeologists during what it said was an excavation coordinated during the Waqf dig. The released discoveries included fragments of bowl rims, bases and body shards, the base of a juglet used for the ladling of oil, the handle of a small juglet and the rim of a storage jar.
Mazar and other leading archaeologists speaking to WND today said they were “dumbfounded” the Antiquities Authority claimed any excavation was done during the Islamic dig.
“Perhaps finds were discovered in between the teeth of the Waqf bulldozers, but it’s ridiculous to say the Antiquities Authority supervised or conducted any proper dig,” said Mazar of Hebrew University. “No proper excavation is conducted with bulldozers. No one saw or reported any excavation. How can an excavation be conducted in secret? Such work is a big job. They are trying to hide their failure to stop the Islamic destruction.”
Mazar is also a fellow at Israel’s Shalem Center and a member of the Public Committee for Prevention of the Destruction of Antiquities on Temple Mount. Her much-discussed discovery in the City of David, a neighborhood just south of Jerusalem’s Old City Walls, is a massive building dating to the 10th century B.C. It is believed to be the remains of the palace of biblical King David, the second leader of a united kingdom of Israel, who ruled from around 1005 to 965 B.C.
The last time the Waqf conducted a large dig on the Temple Mount – during construction 10 years ago of a massive mosque at an area referred to as Solomon’s Stables – the Waqf reportedly disposed of truckloads of dirt containing Jewish artifacts from the First and Second Temple periods.
After media reported the disposals, Israeli authorities froze the construction permit given to the Waqf, and the dirt was transferred to Israeli archaeologists for analysis. The Israeli authorities found scores of Jewish Temple relics in the nearly discarded dirt, including coins with Hebrew writing referencing the Temple, part of a Hasmonean lamp, several other Second Temple lamps, Temple-period pottery with Jewish markings, a marble pillar shaft and other Temple-period artifacts. The Waqf was widely accused of attempting to hide evidence of the existence of the Jewish Temples.
CAMBRIDGE, Massachusetts: It is a theory that gives indigestion to mainstream archaeologists. Namely, that some of the immense blocks of the Great Pyramids of Egypt might have been cast from synthetic material - the world’s first concrete - not just carved whole from quarries and lugged into place by armies of toilers.
Such an innovation would have saved millions of man-hours of grunting and heaving in construction of the enigmatic edifices on the Giza Plateau.
“It could be they used less sweat and more smarts,” said Linn Hobbs, professor of materials science at the Massachusetts Institute of Technology.
“Maybe the ancient Egyptians didn’t just leave us mysterious monuments and mummies. Maybe they invented concrete 2,000 years before the Romans started using it in their structures.”
That is a notion that would dramatically change engineering history.
It has long been believed that the Romans were the first to employ structural concrete in a big way, although the technology may have come from the Greeks.
A handful of determined materials scientists are carrying out experiments with crushed limestone and natural binding chemicals - materials that would have been readily available to ancient Egyptians - designed to show that blocks on the upper reaches of the pyramids may have been cast in place from a slurry poured into wooden molds.
These researchers at labs in Cambridge, Philadelphia and St. Quentin, France, are trying to demonstrate that Egyptians of about 2,500 B.C. could have been the true inventors of the poured substance that is humanity’s most common building material.
At MIT, Hobbs and two colleagues teach a course called Materials in Human Experience. Over the years, undergraduates in the program have recreated from scratch such artifacts as samurai swords, tinkling Meso-American bells and even a swaying 60-foot, or 20-meter, plant-fiber suspension bridge like those built by the Incas.
Now a scale-model pyramid is rising in Hobbs’s sixth-floor lab, a construction made of quarried limestone as well as concrete-like blocks cast from crushed limestone sludge fortified with dollops of kaolinite clay, silica and natural desert salts - called natron - like those used by ancient Egyptians to mummify corpses.
The MIT pyramid will contain only about 280 blocks, compared with 2.3 million in the grandest of the Great Pyramids. And no whips cracked overhead last week as Myat-Noe-Zin Myint, Rachel Martin and three other undergraduates stuffed quivering, just-mixed “Egyptian” concrete into cobblestone-sized wooden molds marked “King Tut Plywood Co.”
“It feels like Jell-O but will turn rock-hard,” Myint said of the sharp-smelling concoction.
The aim of the class is to teach engineering innovation, but the project may also prove that ancients, at least in theory, could have cast pyramid blocks from similar materials, which would have been available from dried river beds, desert sands and quarries.
Hobbs described himself as “agnostic” on the issue but said he believed mainstream archaeologists had been too contemptuous of work by other scientists suggesting the possibility of concrete.
“The degree of hostility aimed at experimentation is disturbing,” he said. “Too many big egos and too many published works may be riding on the idea that every pyramid block was carved, not cast.”
Archaeologists, however, say there is simply no evidence that the pyramids are built of anything other than huge limestone blocks. Any synthetic material showing up in tests - as it has occasionally, even in work not trying to prove a concrete connection - is probably just slop from “modern” repairs done over the centuries, they say.
“The blocks were quarried locally and dragged to the site on sleds,” said Kathryn Bard, an Egyptologist at Boston University and author of a new book, “An Introduction to the Archaeology of Ancient Egypt.”
“There is just no evidence for making concrete, and there is no evidence that ancient Egyptians used the stuff,” she said.
The idea that some pyramid blocks were cast of concrete-like material was aggressively advanced in the 1980s by the French chemical engineer Joseph Davidovits, who argued that the Giza builders had pulverized soft limestone and mixed it with water, hardening the material with natural binders that the Egyptians are known to have used for their famous blue-glaze ornamental statues.
Such blocks, Davidovits said, would have been poured in place by workers hustling sacks of wet cement up the pyramids - a decidedly less spectacular image than the ones popularized by Hollywood epics like “The Ten Commandments,” with thousands of near-naked toilers straining with ropes and rollers to move mammoth carved stones.
“That’s the problem, the big archaeologists - and Egypt’s tourist industry - want to preserve romantic ideas,” said Davidovits, who researches ancient building materials at the Geopolymer Institute in St. Quentin.
In 2006, research by Michel Barsoum at Drexel University in Philadelphia found that samples of stone from parts of the Khufu Pyramid were “microstructurally” different from limestone blocks.
Barsoum, a professor of materials engineering, said microscope, X-ray and chemical analysis of scraps of stone from the pyramids “suggest a small but significant percentage of blocks on the higher portions of the pyramids were cast” from concrete.
He stressed that he believes that most of the blocks in the Khufu Pyramid were carved in the manner long suggested by archaeologists. “But 10 or 20% were probably cast in areas where it would have been highly difficult to position blocks,” he said.
Barsoum, a native of Egypt, said he was unprepared for the onslaught of angry criticism that greeted peer-reviewed research published two years ago by himself and his fellow scientists, Adrish Ganguly of Drexel and Gilles Hug of the National Center for Scientific Research in France.
“You would have thought I claimed the pyramids were carved by lasers,” Barsoum said.
Ancient drawings and hieroglyphics are cryptic on the subject of pyramid construction. Theories as to how the Egyptians might have built the huge monuments to dead pharaohs depend heavily on conjecture based on remnants of rubble ramps, as well as evidence that nearby limestone quarries contained roughly as much stone as is present in the pyramids.
Zahi Hawass, head of the Supreme Council of Antiquities in Egypt, minced no words in assailing the concrete idea. “It’s highly stupid,” he said via a spokesman. “The pyramids are made from solid blocks of quarried limestone. To suggest otherwise is idiotic and insulting.”
Hobbs and his students are undismayed by the controversy.
“It’s fascinating to think that ancient Egyptians may have been great materials scientists, not just great civil engineers,” Hobbs said.
“None of this lessens the accomplishments of the ancient Egyptians, although I suppose pouring concrete is less mysterious than moving giant blocks. But it really just suggests these people accomplished more than anyone ever imagined.”
BERLIN — It sounds like a breathless Hollywood script: treasure-hunter Indiana Jones races with German archaeologists to track down the fabled Ark of the Covenant, the chest that held the stone tablets on which the Ten Commandments were etched.
Now German researchers claim to have found the remains of the palace of the Queen of Sheba — and an altar that may have held the Ark.
The discovery, announced by the University of Hamburg last week, has stirred skeptical rumblings from the archaeological community.
The location of the Ark, indeed its existence, has been a source of controversy for centuries.
Regarded as the most precious treasure of ancient Judaism, it is at the heart of a debate about whether archaeology should chronicle the rise and fall of civilizations or explore the boundaries between myth and ancient history.
Professor Helmut Ziegert, of the archaeological institute at the University of Hamburg, has been supervising a dig in Aksum, northern Ethiopia, since 1999.
“From the dating, its position and the details that we have found, I am sure that this is the palace,” he said.
The palace, that is, of the Queen of Sheba, who is believed to have lived in the 10th century B.C.
After she died, her son and successor, Menelek, replaced the palace with a temple dedicated to Sirius.
The German researchers believe that the Ark was taken from Jerusalem by the queen — who had a liaison with King Solomon — and built into the altar to Sirius.
“The results we have suggest that a Cult of Sothis developed in Ethiopia with the arrival of Judaism and the Ark of the Covenant, and continued until 600 A.D.,” an announcement by the University of Hamburg on behalf of the research team said.
Sothis is the ancient Greek name for the star Sirius.
The Ark was made, according to the Bible, of gold-plated acacia wood and topped with two golden angels. It is said to be a source of great power. In about 586 B.C., when the Babylonians conquered the Israelites, the Ark vanished.
For many centuries finding it has been one of the great quests — inspiration not only for the 1981 film “Raiders of the Lost Ark,” but also for countries seeking to position themselves in the mainstream of ancient civilization.
Many archaeologists believe that their profession should not be in the business of myth-chasing. Even if the Ark were found, it would be impossible to establish scientifically whether it was the original receptacle for the Ten Commandments.
Iris Gerlach of the German Archaeological Institute in Sanaa, Yemen, believes the religious centre of Sheba is in present-day Yemen.
Although she does not go head-to-head with her colleague Professor Ziegert, the message is clear: A relic such as the Ark would have been stored in an important religious city rather than in Aksum.
Quest goes on
— The location of the Ark has been put in Egypt, Zimbabwe and even Ireland, where the Hill of Tara was excavated
— The Ethiopian holy town of Aksum is regarded as a more credible site
— Ethiopians believe that it is defended by monks in the church of St. Mary of Zion and is seen only by the guardian of the Ark, making it impossible to verify
At least part of the mystery of Stonehenge may have now been solved: It was, from the beginning, a monument to the dead.
New radiocarbon dates from human cremation burials in and around the brooding stones on Salisbury Plain in England indicate that the site was used as a cemetery from 3000 B.C. well into its zenith around 2500 B.C., British archaeologists reported Thursday.
What appeared to be the head of a stone mace, a symbol of authority, was found with one of the burials, the archaeologists said, indicating that this was probably a cemetery for the ruling dynasty responsible for erecting Stonehenge.
“It’s now clear that burials were a major component of Stonehenge in all its main stages,” said Mike Parker Pearson, an archaeologist at the University of Sheffield in England.
In a teleconference with reporters, arranged by the National Geographic Society, Parker Pearson described the three burials of burned bones and teeth that had been dated in recent weeks. Researchers estimated that as many as 240 people had been buried there, all of them cremated.
Other evidence from the British Isles shows that skeletal burials were rare at this time and that cremation was the custom for the elite.
Another Sheffield archaeologist, Andrew Chamberlain, noted one reason to think that Stonehenge burials were for generations of a single elite family. The clue, he said, is the small number of burials in the earliest period and the larger numbers in later centuries, as offspring would have multiplied.
Given the monumental surroundings, Parker Pearson said, “One has to assume anyone buried there had some good credentials.”
The earliest burial to be tested came from a pit at the edge of the stone monuments; it dates to about 3000 B.C. The second burial dates to about 2900 B.C. The most recent one is from about the time the first arrangements of stones appeared on the plain, about 2500 B.C. It was previously believed that this had been a burial site for only a century after 2700 B.C., thus well before the distinctive large stones were put in place.
Parker Pearson said that finding more datable burials was “a huge priority” of the Stonehenge Riverside Project, which has been excavating the site for eight years. The National Geographic Society is a supporter of the research, and some of the results, other than the burial dating, are reported in the June issue of its magazine.
Although some of the cremated remains were uncovered decades ago, Parker Pearson said, it is only in recent years that improved methods of radiocarbon dating made it possible to analyze burned bones.
In other recent findings at Stonehenge and adjacent sites, archaeologists uncovered a piece of a red deer antler that apparently had been used as a pick for digging. It was found in what is known as the Stonehenge Greater Cursus, a cigar-shaped ditched enclosure nearly two miles, or three kilometers, long, and it is thought to have a sacred significance.
Julian Thomas, an archaeologist at the University of Manchester who led this investigation, said the antler had been dated to 3630 B.C. to 3375 B.C. That puts the cursus about 1,000 years before the large stones were erected, meaning, he said, that “this landscape maintains its significance over a long period of time.”
LONDON — The first excavation of Stonehenge in more than 40 years has uncovered evidence that the stone circle drew ailing pilgrims from around Europe for what they believed to be its healing properties, archaeologists said Monday.
Archaeologists Geoffrey Wainwright and Timothy Darvill said the content of graves scattered around the monument and the ancient chipping of its rocks to produce amulets indicated that Stonehenge was the primeval equivalent of Lourdes, the French shrine venerated for its supposed ability to cure the sick.
An unusual number of skeletons recovered from the area showed signs of serious disease or injury. Analysis of their teeth showed that about half were from outside the Stonehenge area.
“People were in a state of distress, if I can put it as politely as that, when they came to the Stonehenge monument,” Darvill told journalists assembled at London’s Society of Antiquaries.
He pointed out that experts near Stonehenge have found two skulls that showed evidence of primitive surgery, among just a few known cases of operations in prehistoric Britain.
“Even today, that’s the pretty serious end of medicine,” he said.
Also found near Stonehenge was the body of a man known as the Amesbury Archer, who had a damaged skull and badly hurt knee and died around the time the stones were being installed. Analysis of the Archer’s bones showed he was from the Alps.
Darvill cautioned, however, that the new evidence did not rule out other uses for Stonehenge.
“It could have been a temple, even as it was a healing center,” Darvill said. “Just as Lourdes, for example, is still a religious center.”
The archaeologists managed to date the construction of the stone monument to about 2300 B.C., a couple of centuries younger than was previously thought.
It was at that time that bluestones — a rare rock known to geologists as spotted dolerite — were transported by hand or by raft from Pembrokeshire in Wales to Salisbury Plain in southern England, to create the inner circle of Stonehenge.
The outer circle, composed of much larger sandstone slabs, is what most people associate with the monument today, particularly since only about a third of the 80 or so bluestones remain.
The scientists argued that they were once at the heart of Stonehenge, and closely associated with its healing properties.
As evidence, Darvill said his dig had uncovered masses of fragments carved out of the bluestones by people to create amulets. Any rock carried around in such a way would have had some sort of protective or healing property, he said.
He said that theory was backed by burials in southwest England where the stones were interred with their owners.
Today the bluestones are largely invisible, dwarfed by the huge sandstone monoliths — or “hanging stones” — that were erected later and still make up Stonehenge’s iconic profile.
“They are of course quite impressive when you see them,” Darvill said. “But in a sense they are the elaboration of a structure which kicked off with the bluestones.”
Both archaeologists quoted the 12th-century monk Geoffrey of Monmouth as saying the stones were thought to have medicinal properties. They also said that evidence uncovered by their dig showed that people were moving and chipping off pieces of the bluestones through the Roman period and even into the Middle Ages.
Darvill said he felt the “folklore interest” in the bluestones into modern times suggested some sort of lingering memory of their supposed healing powers.
“That would be for me the single strongest piece of evidence,” he said.
Andrew Fitzpatrick, from British heritage group Wessex Archaeology, said Darvill and Wainwright’s discovery was “very important” but that the healing theory, while plausible, was not the only one.
“I don’t think we can rule out the other main competing theory — that the temple was a meeting point between the land of the living and the dead,” he told the British Broadcasting Corp.
The scientists announced their findings Monday, ahead of a documentary due to air on the BBC and the Smithsonian Channel on Saturday, Sept. 27.
King Arthur and his knights, depicted in a medieval painting, sit around the famous round table. Modern historians claim the round table may have been located in a Roman amphitheater.
King Arthur, Lancelot, and the other knights of the round table are more than mere stories. In fact, one British historian has found precisely where that famous table once sat — and what exactly it was.
According to the Camelot historian, the famous table was no table at all.
He claims the circular interior of a former Roman amphitheater in Chester, England, was where the knights convened, and will reveal all the details of his discoveries in “King Arthur’s Round Table Revealed,” which airs on The History Channel July 19.
Historian Chris Gidlow said Arthur would have reinforced the building’s 40-foot walls to create an imposing and well-fortified base. The king’s regional noblemen would have sat in the central arena’s front row, with lower-ranked subjects in the outer stone benches.
Arthur has been the subject of much historical debate, but many scholars believe him to have been a 5th or 6th century leader. The legend links him to 12 major battles fought over 40 years — and one of his principal victories was said to have been at Chester.
Researchers say the recent discovery at the amphitheater of an execution stone and a wooden memorial to Christian martyrs suggests the lost City of the Legions was Chester.
“The first accounts of the Round Table show that it was nothing like a dining table but was a venue for upwards of 1,000 people at a time,” said Gidlow.
“In the 6th century, a monk named Gildas, who wrote the earliest account of Arthur’s life, referred both to the City of the Legions and to a martyr’s shrine within it,” he explained. “That’s the clincher. The discovery of the shrine within the amphitheater means that Chester was the site of Arthur’s court — and his legendary Round Table.”