Entries from April 1997

1917 And All That

David Frum, April 28th, 1997

You open a magazine and there’s an advertisement — for blue jeans, for perfume, for a radio station, it could be anything. The ad copy says something like, “Breaking all the rules.”

Our culture, high and low, is suffused with a gleeful contempt for traditional forms of authority and traditional standards. This contempt inspires rock videos and the proceedings of the Modern Language Association; it can be seen in situation comedies and dictionaries. And yet, paradoxically, one can at the same time sense in contemporary America a desperate hunger for rules and standards. This hunger has made millionaires of the popular authors and broadcasters who can speak to it — William Bennett, Judith Martin, Laura Schlessinger, and the two New York area women who composed The Rules. Across the country, aspiring politicians have won thousands of elections to school boards, to district attorney’s offices, to Congress by promising stricter and tougher enforcement of moral norms. Intellectuals as diverse as communitarian philosopher Michael Sandel and immigration critic Peter Brimelow have tried, each in his own way, to formulate some new vision of a coherent America. President Clinton himself won his uphill battle in 1992 by identifying himself with a “forgotten middle class” that “plays by the rules.”

Human beings yearn for rules to live by. And yet it’s equally clear that these rules — whether of language, of aesthetics, of etiquette, of academic excellence, of the most fundamental areas of morality — aren’t there, that they haven’t been there for a very long time, and that they therefore cannot be enforced. The Supreme Court has gone so far as to claim that the very idea of agreed-upon norms may contravene the fundamental promises of the American Constitution. It said in the landmark 1992 case Planned Parenthood v. Casey, “At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life.”

The problem is, of course, that most of us simply aren’t capable of defining our own concepts of meaning. We take them secondhand or third-hand from others — from the leaders of our society, from our elites. John Maynard Keynes described this transmission of values in his characteristically blunt way: “Civilization was a thin and precarious crust erected by the personality and will of the few, and only maintained by rules and conventions skillfully put across and guilefully preserved.” No wonder, then, that so many feel their civilization is corroding. American and European elites have declined to do their job of putting across and preserving the rules and conventions most people yearn for. It’s not that our elites have ceased to govern. It’s that they govern in a curious, arguably
unprecedented, way. They present themselves less as elites than as anti-elites. They rest their legitimacy — their right to rule in politics, their right to lead intellectual opinion, their right to decide aesthetic questions, their right to construe the law, their right to instruct the young — not on their ability to interpret and preserve society’s inherited rules, but on their eagerness to emancipate their fellow citizens from those inherited rules. It has been the great theme of the 20th century, and it threatens to dominate the 21st as well.

Moral certainty has been ebbing out of our culture for a very long time. One can trace the loss as far back as one wants to go. Matthew Arnold heard the “melancholy, long, withdrawing roar” of the sea of faith at Dover Beach 130 years ago. Or one can put the finger on a time much closer to us: For Americans, that would be the 1950s and 1960s, when the disaster in Vietnam and complicity with segregation discredited their parents in the eyes of today’s forty- and fifty-something baby boomers. And yet despite all the complexities and subtleties of history, perhaps one can discern a single epicenter of the great shock that still convulses the culture of the Western world: April 1917, 80 years ago this month, when President Wilson took the United States into the war raging in Europe.

The Great War was fought a long time ago, and since then our century has suffered no end of atrocities, some of them even more appalling than the slaughter in the trenches. But it is not wrong, I think, to continue to see the war as the central event of modern times, the caesura that cleaves the Western world’s 200-year-old experiment with bourgeois civilization precisely in half.

On our side of the divide, all is flux and uncertainty. On the other side lies a world in many ways more constricting than our own, but also in many ways more creative and successful. Tally up the great cultural achievements of modern times, and count how many of them we owe to people who came of age before 1914 and how comparatively paltry are the accomplishments of their successors born after 1900. It’s as if some great wave of human genius took shape in the 1880s and ’90s and crested in the 1910s and ’20s, bequeathing us Proust and Picasso, James Joyce and Albert Einstein, T. S. Eliot and Sergei Eisenstein, Mann and Matisse, Rilke and Yeats, Frank Lloyd Wright and Maxim Gorky. Since then, our culture seems to have lost its force, the attainments of each subsequent decade more wan and mediocre than those of the decade before. It’s as if by winning their freedom from Victorian restrictions, the last generation of Victorians extinguished their most powerful inspiration.

Unquestionably, the 19th century was an almost comically rule-bound age. A woman calling on a married couple must leave behind two of her husband’s cards and one of her own. Poetry must rhyme. The shortest distance between two points must be a straight line. People who spend much time in the mental world of the last century — whether they are distinguished professors or readers of novels — can become dizzied by it all. The discipline seems as uncomfortable, even oppressive, as the clothes that respectable ladies and gentlemen were then obliged to wear. Every newly enriched Cincinnati pork-packer and every Berlin iron-smelter felt obliged to take upon himself and his family the rigid etiquette that once had governed the lives of princes and popes.

This rigor in daily life was only the most mundane expression of the mood of certainty that suffused 19th-century culture. There were, inevitably, dissenters from this confidence even at the time. But most people, even most highly educated people, lived in a world in which right and wrong, beauty and ugliness, civility and incivility, excellence and failure were categories more certain, more readily understood, more distinct from one another, and more in accord with their ordinary intuitions than these categories are today.

This is what we lost in the First World War.

Paul Fussell has observed that “the Great War took place in what was, compared to ours, a world where the values appeared stable and where the meanings of abstractions seemed permanent and reliable. Everyone knew what Glory was, and what Honor meant. Hemingway could declare in A Farewell to Arms that ‘abstract words such as glory, honor, courage, or hallow were obscene beside the concrete names of villages, the numbers of roads, the names of rivers, the numbers of regiments and the deaths.’ In the summer of 1914 no one would have understood what he was talking about.”

That was soon to change.

“In all my dreams, before my helpless sight,
He plunges at me, guttering, choking, drowning.
[. . .]
If you could hear, at every jolt, the blood
Come gargling from the froth-corrupted lungs,
Obscene as cancer, bitter as the cud
Of vile, incurable sores on innocent tongues,
My friend, you would not tell with such high zest
The old Lie: Dulce et decorum est
Pro patria mori.”

The Latin lines that Wilfred Owen used to conclude this, the most famous poem in our language about World War I, translate as “It is sweet and fitting to die for one’s
country.” They come from an ode of Horace’s that any educated Englishman of the last century would have learned in school. Scoffing at those words represented more than a rejection of war. It meant a rejection of the schools, the whole society, that had sent Owen to war: its bank presidents, bishops, and princesses fully as much as its generals. This theme of resentment — even hatred — of established authority pervades the retrospective judgments, literary and political, of the intellectuals of Europe and America upon the war. “Mr. Wilson’s war,” John Dos Passos angrily dubbed it in his great trilogy USA, as if the war had been inflicted on the United States by a single out-of-control politician.

The fiction that America had been lured into the war on false pretenses was a popular one in the 1920s and 1930s. Road to War, a 1935 bestseller by Walter Millis, an editorial writer for the New York Herald Tribune, pinned the blame squarely on Big Business and the House of Morgan: “The mighty stream of supplies flowed out and the corresponding stream of prosperity flowed in, and the United States was enmeshed more deeply than ever in the cause of Allied victory.” A congressional committee chaired by Sen. Gerald Nye argued instead that it was the arms manufacturers — the famous “merchants of death” — who were responsible for dragging America into an unwanted and unnecessary catastrophe. But all the mutually amplifying postwar critics of the American intervention in the war, however much they disagreed on the identities of the culprits, agreed at least — and persuaded the public — that the men and institutions responsible for the decision to
fight in 1917 stood irredeemably discredited.

If cruelty and misery can be measured, then we can truly say that human beings have done and suffered worse things than they did and suffered in 1917. But never have so many people died such painful and terrifying deaths because of the boundlessness of human stupidity as in that year. Because the leaders responsible for that stupidity — the kaisers and tsars, the generals and bishops, the journalists and professors — epitomized Europe’s traditional authority, the postwar revulsion against them naturally blended into a condemnation of traditional authority altogether. In Germany,
Russia, and Austria-Hungary, this revulsion acquired revolutionary force, toppling rulers and shattering empires. In the democratic West, there was no revolution to be made, and so this same mood of revulsion expressed itself as an omnidirectional cynicism.

And there was so much to be cynical about! By the spring of 1917, at least some of the Allied commanders — and quite a few of the German — seem to have begun to understand industrial war. At Vimy Ridge and Messines, British and Canadian troops won signal victories by advancing behind 500-foot-thick “creeping” barrages of artillery that moved ahead foot by foot to clear the ground ahead of them. They learned to maneuver in tiny squads, to attack at night, to halt at planned destinations and wait for the artillery in the rear to be moved forward to shelter them again. Inexcusably, the Allied High Command and the British and French political leadership would not absorb these lessons for another year. Instead, that spring and fall they led their trusting troops into two of the worst catastrophes of the whole war. In April 1917, the French — persuaded by General Robert Nivelle, one of those dapper imbeciles who have led that country’s soldiers to defeat after defeat over the past 125 years — gambled upon a ferocious but inept attack on thickly fortified German positions in the province of Champagne. In 48 hours of fighting, the French gained 600 yards…and lost 100,000 killed and wounded.

This calamity shattered the French army. Officers began to record incidents of disaffection and outright disobedience. In late May, some 30,000 troops mutinied, defying orders to go over the top and pile their corpses up in no-man’s-land. Dissension quickly spread to the rear. The despair that had felled the tsar appeared to be arriving in France. Nivelle was hastily sacked, and his replacement, Henri Petain, restored order by sentencing 400 mutineers to death (only 50 of the sentences were in the end carried out) and promising his troops better food and living conditions. The French army didn’t really recover, however, and it could be argued that it hasn’t recovered yet.

The cloth-headed British general Sir Douglas Haig, meanwhile, was planning what would soon prove an even more dreadful failure: the campaign known to the history books as “Third Ypres” but that is usually remembered as Passchendaele. More than any other battle, more than the Somme or Verdun, where the casualties were greater, it is Passchendaele that shapes and colors the English-speaking world’s collective memory of the First World War.

Haig dreamed of breaking through the German lines at their northernmost point, in western Belgium, and then ordering his cavalry to charge through the gap and wheel around and behind the German army. He imagined a strategic flanking victory that would forever gild his name as one of history’s great commanders. The plan was foolish from the beginning. Even had Haig somehow “broken through,” three and a half years of war had churned the below-sea-level ground of Flanders, the western flange of the Rhine river delta, into a great soupbowl of mud. The ridges that protruded above sea level were soaked every fall by torrential rains. It wasn’t ground for a horse to charge over. It wasn’t ground a man carrying a 60-pound pack could walk over.

Basil Liddell Hart, in his classic 1930 history of the war, put into circulation a possibly invented anecdote. A “highly placed officer from General Headquarters was on his first visit to the battle front — at the end of the four months’ battle. Growing increasingly uneasy as the car approached the swamp-like edges of the battle area, he eventually burst into tears, crying, ‘Good God, did we really
send men to fight in that?’ To which his companion replied that the ground was far worse ahead.” The British writer Lyn Macdonald collected in the mid-1970s some 600 eyewitness accounts of the battle, which officially commenced on July 31 and sputtered out on November 10, and ultimately inflicted 250,000 dead and wounded on the British, Canadian, and Australian troops engaged in it. Her stories are the stuff of nightmares.

Listen to Major George Wade, an officer in the South Staffordshire Regiment.

Going up to the line for the first time my first indication of the horrors to come appeared as a small lump on the side of the duckboard. I glanced at it, as I went past, and I saw to my horror that it was a human hand gripping the side of the track — no trace of the owner, just a glimpse of muddy wrist and a piece of sleeve sticking out of the mud. After that there were bodies every few yards. Some lying face down on the mud; others showing by the expressions fixed on their faces the sort of effort they had made to get back on the track.

Sometimes you could actually see blood seeping up from underneath.

Or Sergeant John Berry of the Rifle Brigade:

We heard screaming coming from another crater a bit away. I went over to investigate with a couple of the lads. It was a big hole and there was a fellow of the 8th Suffolks in it up to his shoulders. So I said, ‘Get your rifles, one man in the middle to stretch them out, make a chain and let him get hold of it.’ But it was no use. It was too far to stretch, we couldn’t get any force on it, and the more we pulled and struggled the further he seemed to go down. He went down gradually. He kept begging us to shoot him. But we couldn’t shoot him. Who could shoot him? We stayed with him, watching him go down in the mud.

Or Private Miles of the Royal Fusiliers:

The moment you set off you felt that dreadful suction. It was forever pulling you down, and you could hear the sound of your feet coming out in a kind of ‘plop’ that seemed much louder at night when you were on your own. In a way, it was worse when the mud didn’t suck you down; when it yielded under your feet you knew that it was a body you were treading on. It was terrifying. You’d tread on one on the stomach, perhaps, and it would grunt all the air out of its body. It made your hair stand on end. The smell could make you vomit.

In the end, of course, the Allies won the war. Contrary to the view one often hears that the First World War was “about nothing,” the world is very much a better place because they did. Minus one genocidal demagogue, after all, the Germans’ war aims in 1914-18 were very similar to their aims in 1939-45. But the Allied victory owed very little to the wanton slaughter on the battlefield and almost everything to the British and American naval blockade that ultimately did to the Germans what the Germans had hoped the submarine would do to Britain: starve them into surrender. Hundreds of thousands — perhaps millions — of Allied soldiers died wasted deaths because their generals were too blind to perceive the remorseless economics of “total war.”

The survivors of the war on either side of the trenches, and the widows and orphans of those who did not survive, would never again fully trust the political authorities who waged the war. With considerable justification. The newest account of the battle of Passchendaele, published last year by Robin Prior and Trevor Wilson, points out that the politicians who professed outrage at Haig’s recklessness had every opportunity to prevent and then call off his futile attacks. They preferred to avert their eyes. So “as the rain fell on Flanders and thousands of Haig’s soldiers prepared to struggle through the mud to their doom, the Prime Minister who was proclaiming the futility of this undertaking failed to raise a finger to stop it.”

When it was all over, the war turned out to have killed not just millions of young men. It killed, or left terminally wounded, the idea that deference to authority can have any legitimate role in a modern society. The habit of deference, of course, went on to outlive its justifications by another half century or so. Its final disappearance, in the convulsions of the 1960s and 1970s, is another story. But since 1918 the argument that democracies need self-confident intellectual and aesthetic elites has been a losing one.

Curiously, the more we make our peace with the necessity of economic winners and losers as the price of a dynamic free-market economy, the more we seem to object to any other form of inequality. Elites hold on to their positions by denying that they exist, by denying even their right to exist, by taking the lead in the destruction of what few norms and standards remain. And the ruin is not yet complete.

Somewhere under the soil of northern France, an enormous mass of explosives lies waiting. The British dug 21 great tunnels under the German lines on Messines Ridge, and in June 1917 they packed them full of dynamite and set them off. The sound of the detonation, the loudest until then ever made by man, could be heard on the English coast. Two of the mines failed to explode. One was located and neutralized in 1969. The other, reports the English historian Martin Gilbert, still rusts underground, inspiring a certain nervousness in the residents of the vicinity. Intellectually and morally too, the explosive material left behind by the First World War lies ticking beneath our feet.

Originally published by The Weekly Standard

Given the Circumstances Israel Shouldn’t Surrender More

David Frum, April 12th, 1997

Over the past three years, the Middle East peace process has proven itself all process, no peace. Last week, violence flared again on the West Bank. Stone- and Molotov cocktail-throwing youths attacked Israeli forces in the centre of Hebron. The Israelis, cornered, fired rubber bullets at dangerously close range. Three Palestinians were shot dead, two by troops, one by apparently trigger-happy Israeli civilians.

This bloody episode is unusual in one way only: the casualties were Palestinians. Most of those killed and maimed in the three years since Yitzhak Rabin and Yasser Arafat shook hands on the White House lawn have been Israelis: Israelis aboard buses in Tel Aviv, Israelis shopping at malls, Israelis driving their cars down highways. They have been blown to bits by suicide bombers, firebombed by guerrillas, attacked by angry mobs.

Nor can the terror against Israel be blamed entirely on a handful of extremists. Barton Gellman, the Washington Post’s first-rate Middle East correspondent, last week reported a new survey by the
Nablus-based Centre for Palestinian Research and Studies. It found broad support for terrorism in the general Palestinian population. Fifty-two per cent of Palestinians believe there can be no acceptable compromise between the Israelis and themselves. Thirty-eight per cent favor further terrorist attacks on Israeli civilians. Thirty per cent are ready for intifada-like attacks on Israeli troops. And the popular clamor for war is increasingly influencing the Palestinian elite: Gellman reports that two important Arafat lieutenants who have backed the peace strategy, Ahmed Adik and Marwan Barghouti, now advocate a return to violent confrontation.

Canadians who get their information about the Middle East from their newspapers or the CBC may think they know why the Palestinians are again taking up the bomb and the gun: it’s because the
Palestinians — according to the Canadian media — are angry that Israel has not evacuated their land. What most Canadian Middle East news omits, however, is the glaring fact pointed out by Charles Krauthammer in a brilliant article in the April 14 issue of the Weekly Standard: the Israelis have evacuated the Palestinians’ land.

Of the 2.3 million Palestinians living in the
West Bank and Gaza, 2.25 million — 98% — live on land ruled by the Palestinian Authority. As Krauthammer observes, “[T]he violence you see on your TV screen is not the work of an unjustly occupied people wanting to be free. It is the work of an already freed people trying to storm demarcation lines solemnly established by their own leadership to separate their territory from Israel’s.”

Krauthammer is right. The Palestinians killed in Hebron last week were shot only after the mob had crossed the boundary line between the Jewish and Arab areas of the city — a line established by the previous round of negotiations — on their way to Jewish residential quarters.

Some will (again) choose to blame the Hebron confrontation on Israel, and specifically on Prime Minister Benjamin Netanyahu’s decision to order construction of a new suburb in east Jerusalem.
But what sort of an explanation is that? Under Oslo, Israel surrendered large stretches of territory in return for one thing only: a promise by the Palestinians to stop using violence and terrorism, and to seek peaceful resolutions of their grievances. And yet, at the very first grievance to come along, the Palestinians — incited by Arafat himself — utterly disregard their sworn obligations, and immediately resort to violence.

Israel struck a deal with the Palestinians: evacuate the main Palestinian population centres in exchange for peace, with other questions — including the status of Jerusalem — postponed until later. Israel has scrupulously handed over every stipulated acre of ground. In return, however, Israel has received not the promised peace, but violence ever closer to home. More ominously still, the Arab states surrounding Israel are also turning to confrontation. At the end of March, the Arab League voted in Cairo to restore its economic boycott of Israel.

“Oslo,” Krauthammer concludes, “is a process in which Israel is rewarded with temporary gifts for its process of retreat. So long as the retreat is in progress, everyone is happy. But if the retreat even pauses, all hell breaks loose.”

Why would any sane Israeli want to surrender anything more under such circumstances?

Originally published in The Financial Post

What’s Wrong with this Plan?

David Frum, April 2nd, 1997

For conservatives, the performance of the Republican Congress is rapidly deteriorating from the depressing to the embarrassing. Each week we wonder, can things get worse? And the answer always seems to be, yes, they can.

Now Senator Orrin Hatch of Utah has unveiled a plan to give the government an even greater role in health care. Senator Hatch and his Democratic co-sponsor, Senator Edward Kennedy of Massachusetts, propose to give states up to $5 billion a year to subsidize health coverage for children who lack insurance but whose parents earn too much to qualify for Medicaid. The Senators would raise the money by increasing the Federal tax on cigarettes by 43 cents a pack.

Politically, the plan is perfect — Dick Morris himself couldn’t have triangulated anything more clever. It goes after everybody’s favorite bad guy — the cigarette industry — and redistributes the money to the most photogenic beneficiaries possible. But the plan raises a follow-up question: If Federal spending is the key to providing coverage for uninsured children, why hasn’t the problem been solved already?

After all, back in 1986 Congress greatly enlarged the reach of Medicaid by permitting states to use Federal dollars to provide health coverage to all children under 19 they considered poor, even those whose parents didn’t receive welfare. Then in 1990, Congress took a further step and made it mandatory that states provide Medicaid coverage to all poor children by the year 2002.

What has been the result of all this largess? In 1986, Medicaid consumed a little more than $27 billion a year. In 1997, the program cost $105 billion, and by 2002, when the mandate to extend Medicaid is to be fully imposed, it will cost more than $133 billion. Yet despite this colossal tide of money, the number of uninsured youngsters under 19 remains enormous: an estimated 10 million (as compared with 7.8 million in the late ’80s).

So Senators Hatch and Kennedy are now suggesting that the Government spend even more: Their program would subsidize even families earning up to 185 percent of the official poverty level, or nearly $20,000 a year.

Some might say, well, why not insure all the kids we can? Because the Hatch-Kennedy plan would not be as helpful as it seems.

First, few people really believe that insurance is a children’s issue. The real casualties of the breakdown in the insurance market aren’t 15-year-olds — they tend not to need much medical care in the first place — but older people, like the 50-year-old salesman who was downsized a year ago and is now working part time without benefits. Health-care reforms that ignore the middle-aged in favor of children are sentimental, not sensible.

Second, the Hatch-Kennedy proposal has a perverse incentive for employers: It would encourage them to quit providing health benefits to their workers. Employers pay the cost of health benefits, for the most part, not because they’re kind-hearted, but because they couldn’t attract workers otherwise. When unscrupulous employers of low-wage labor hear that Washington is volunteering to take responsibility for every working parent’s biggest worry — his or her children — they will be tempted to offload that cost.

Third, the financing plan for Hatch-Kennedy is extremely wobbly. The $5 billion it would give the states wouldn’t insure very many kids — probably fewer than half of those now without coverage. And that’s before employers accept Hatch-Kennedy’s invitation to reduce coverage for their workers’ children.

Fourth, nobody can truthfully promise to bring the number of uninsured children down to anywhere near zero through a Government program. Too many of the uninsured young — at least one million of them and possibly twice that number — are the children of illegal residents and therefore are ineligible for any Federal benefit.

The sad truth is, the failures of the American health-care system are what doctors call iatrogenic: a problem caused by failed attempts to cure it. Yet members of Congress who complain that insurance is unaffordable ought to remember the enormous new costs they themselves have imposed on providers. For example, it’s now illegal to sell a health-insurance policy that does not include mental health benefits and a minimum 48-hour hospital stay for new mothers.

Yes, mental health coverage and 48-hour maternity stays are desirable things. But it makes no sense to require that every plan be as loaded down with features as a brand-new Cadillac if that means that millions of Americans are locked out of the health-insurance market as a result.

What America needs from Washington is not more regulation followed by more subsidies to compensate for those regulations. Rather, the Government should allow insurers and other health-care providers to sell basic coverage that lower- and middle-income earners can afford on their own. And it should give the self-employed and others who buy their own insurance tax benefits equivalent to those given to businesses that provide coverage for their employees.

Americans should have a choice among comprehensive health plans, but the 38 million uninsured — adults and children — need a stripped-down plan that’s adequate and affordable: a Model T for health insurance. This idea may not poll well with focus groups. But over the long term it would do more good than any policy gimmick that’s come out of Washington so far.

Originally published in the Financial Post.