Thursday, May 31, 2007
"Americans need to know the history of American anti-communism if they are to understand the great role they have played in ridding the world of the most murderous of the twentieth century totalitarians." -- Richard Gid Powers 1
On October 16, 1961, thousands of people packed the Hollywood Bowl. The occasion was not a rock concert or a sporting event but the biggest anticommunist rally in the country. "Hollywood's Answer to Communism" was carried on nationwide television. Actor George Murphy was the master of ceremonies, and other speakers included Herb Philbrick, Congressman Walter Judd, Dr. Fred Schwarz, Senator Thomas Dodd, and my uncle, W. Cleon Skousen, a former special assistant to J. Edgar Hoover and author of the bestseller The Naked Communist.
I was in my early teens when the anticommunist movement was at its zenith and remember seeing my uncle on TV. I watched shows like I Led Three Lives and read books like John Stormer's None Dare Call It Treason, J. Edgar Hoover's Masters of Deceit, and Whittaker Chambers's Witness.
But despite this groundswell of concern over the threat of communism, communist sympathizers at high levels combined with media forces to ridicule and vilify patriotic conservatives. Most historians dismissed the anticommunist movement of the 1950s and 1960s as "extremist," "paranoid," "right-wing" hysteria. Accordingly, there was little credence given to this alleged vast communist conspiracy; reaction rarely went beyond references to McCarthyism, red-baiting, and blacklisting. They challenged the anti-communists' claims that the Soviets had planted numerous agents in government, that Stalin had infiltrated the film industry as a means of promoting communist propaganda, that the Communist Party USA was a pawn of Moscow, and that the Soviet Union was a serious military threat.
They depicted the anticommunist era as an unwarranted "witch hunt" against liberal progressives and idealistic movie stars and a groundless attack on patriotic government officials who they say were falsely accused of espionage. They carried on a 40-year campaign to prove Alger Hiss and Julius and Ethel Rosenberg innocent. My uncle's book so angered members of the political science and history departments at Brigham Young University that Richard D. Poll, a history professor, wrote a scathing critique of his "extremist" views on Karl Marx and communism.
Mises and Socialism
In those days, the economics profession also cast doubt on free-market criticisms of socialism and the Soviet economy. Half a century earlier, Ludwig von Mises and F. A. Hayek were lone voices in charging that socialist central planning could not work. According to conventional wisdom, Mises and Hayek had lost the debate with the socialists in the 1930s, and in 1985 Paul Samuelson reported in his popular textbook that the Soviet Union had grown faster than any other industrial economy since the 1920s. As late as 1989, Samuelson claimed that "the Soviet economy is proof that, contrary to what many skeptics had earlier believed, a socialist command economy can function and even thrive."2
But then, following the collapse of the Berlin Wall and Soviet communism in 1989-90, economist Robert Heilbroner shocked his colleagues in the socialist world by boldly declaring that the long-standing debate between capitalism and socialism was over. "Capitalism has won," he confessed. "Socialism has been a great tragedy this century." Furthermore, Heilbroner was forced to change his mind about Mises and the debate over socialism. Following the unexpected collapse of communism, Heilbroner admitted, "It turns out, of course, that Mises was right."3 And it wasn't long before Paul Samuelson did an about-face in his textbook, labeling Soviet central planning "the failed model."
Revelations from the Soviet Archives
The fall of the Soviet Union brought about another dramatic outcome that would have far-reaching effects on modern history. The Russian government opened up thousands of secret KGB files in Moscow, revealing what one historian called "stunning revelations" about espionage and the Soviet economy under Stalin. This new information has sparked a harsh reevaluation of the anticommunist movement by historians and the media. As one reviewer put it, "It's like looking into the new edition of a book from which half the pages had previously been torn out."4
The KGB files prove beyond doubt that Alger Hiss, the Rosenbergs, and numerous other Americans accused of spying for the Soviets were guilty. They confirm what J. Edgar Hoover and the House Un-American Activities Committee were saying all along: that spies reached the highest levels of the State and Treasury departments, the White House, and the Manhattan Project, and that the Communist Party USA (which had 50,000 members in World War II) got its marching orders from Moscow.5
Stalin's Economic Disaster
Based on research at the Soviet archives, historian Sheila Fitzpatrick has written a pioneering account of everyday Russian life in the 1930s: "With the abolition of the market, shortages of food, clothing, and all kinds of consumer goods became endemic. As peasants fled the collectivized villages, major cities were soon in the grip of an acute housing crisis, with families jammed for decades in tiny single rooms in communal apartments. ... It was a world of privation, overcrowding, endless queues, and broken families, in which the regime's promises of future socialist abundance rang hollow... Government bureaucracy often turned everyday life into a nightmare."6 What a sharp contrast to Samuelson's glowing account of the Soviet economy.
After writing three books on the Soviet archives, historians John Earl Haynes and Harvey Klehr summed up the anti-communists this way: "They were right." And being right, they deserve our praise and gratitude.
1. Richard Gid Powers, Not Without Honor: The History of American Anticommunism (New York: Free Press, 1995), p. 428.
2. Paul A. Samuelson and William D. Nordhaus, Economics, 13th ed. (New York: McGraw-Hill, 1989), p. 837.
3. Robert Heilbroner, "Reflections After Communism," The New Yorker, September 10, 1990, and "The Triumph of Capitalism," The New Yorker, January 23, 1989.
4. Joseph E. Persico, "The Kremlin Connection," review of The Haunted Wood: Soviet Espionage in America, by Allen Weinstein and Alexander Vassiliev, New York Times Book Review, January 3, 1999.
5. Several books have been published detailing new findings from the Russian archives, including John Earl Haynes and Harvey Klehr's Venona: Decoding Soviet Espionage in America (New Haven: Yale University Press, 1999) and The Soviet World of American Communism (New Haven: Yale University Press, 1998).
6. Sheila Fitzpatrick, Everyday Stalinism (Oxford: Oxford University Press, 1999), flyleaf.
At the time of the original publication, Mark Skousen was an economist at Rollins College, Department of Economics, Winter Park, FL 32789, a Forbes columnist, and editor of Forecasts & Strategies. His just-completed textbook, Economic Logic, is available from FEE.
Wednesday, May 30, 2007
Do you agree with Maturin that Napoleon was a tyrant, an unmitigated disaster? Buonaparte did, after all, reform French law, politics, taxes—doesn't much of his code civil endure today? He also restored French national identity after the Revolution and the Terror. Wellington and Metternich, not to mention Louis XVIII and Charles X, were unreconstructed reactionaries. Yet Maturin sees Napoleon as a kind of early Stalin.
Yes, I do agree with Maturin. I think Buonaparte did France—a country that he hated as a youth—very great harm indeed, not only because he brought about the death of vast numbers of Frenchmen, far more than even Louis XIV, but because he left the country with a curiously vulgar notion of glory, which Louis did not. I do not think he restored French national identity at all, but superimposed upon it a trashy chauvinism that is still sadly active, particularly in the army. One cannot blame him entirely for the miserable decline in music, painting, architecture and furniture-making that coincided with his altogether regrettable existence, for zeitgeist had a great deal to answer for; but there is no doubt that he was devoid of taste (he admired Ossian) and his manners were as indifferent as his French. His utterly unscrupulous rapacity in Italy, Switzerland, Malta and Spain, to say nothing of his treatment of the Pope, may not quite qualify him as a rival to Stalin, but it seems to me quite enough to justify Maturin's opinion.
As for the Code Napoleon, I am not scholar enough to know how much Buonaparte had to do with it, but from what I have seen of the system, or of what remains of it, I do not think it reflects much credit on the authors. It is shockingly authoritarian and misogynistic; and since according to its provisions all the children have equal rights in their parents' property it has a disruptive influence on both family life and the cohesion of an estate. I have often seen the miserable results of this among our friends in the remote provincial corner of France where we live and where many people still depend entirely on the land. The children soon learn—it is a matter of common knowledge—that apart from the small proportion that can be left according to the wishes of the leaver the whole of the rest is theirs as a certain, wholly dependable legacy, however badly they may behave. I will not say that the prospect of being cut off with a shilling in the traditional English way necessarily turns all born under that law into models of filial piety, but I believe it has some effect. And in passing I may observe that parricide is back-page news in this neck of the woods.
As for the rest of the code that is associated with Buonaparte's name, it is so slow, and often so harsh to the accused, that one might almost prefer the English jungle, which does at least preserve some ancient customary law: though indeed Isaiah dismisses all human systems in a line that the Vulgate renders et quasi pannus menstruatae universae justitiae nostrae and the Douay Version as all our justices as the rag of a menstruous woman.
(Excerpt from Paris Review, Issue 135, Summer, 1995 )
Tuesday, May 29, 2007
Among nationwide media outlets such as the six channels providing news programming and America's biggest daily newspapers, just seven percent of working journalists and executives identify themselves as conservatives compared to 33 percent of the general public, according to a study conducted by the Pew Research Center for the People and the Press.
Five times as many national journalists (34%) call themselves liberal, which is an increase from 22 percent who identified themselves as such in a survey the group conducted in 1995.
There is a lot of interesting and compelling data in the Pew Center's poll and it is well worth reading if you have the time, but the fact that it relies upon the sample's own assessment of itself raises questions about its accuracy.
In the field of polling, self-diagnosis is widely regarded as an unreliable measure of ideological identification, largely because most people do not have a good understanding of the ideological spectrum. This is especially true among people who classify themselves as moderates, which is what most journalists claim to be.
Over the years, most pollsters have come to the conclusion that there are very few actual moderates and that most people who claim to be centrists are usually anything but. Research has also shown that among the general public, liberals are more likely to classify themselves as moderate than conservatives.
Although many surveys have shown this, a good example is the 2000 election: while only 20 percent of the public identified itself as liberal, Democratic presidential candidate Al Gore earned 48 percent of the popular vote. Thus, Gore's share of the "moderate" vote had to have been bigger than that of Republican George Bush, who was operating from a larger base of 33 percent.
This is not to say that self-identification is never used among pollsters, but rather that it is only considered useful to gauge a sample's view of itself and then compare it with the sample's actual views (which can be readily ascertained through single-issue questioning).
When this line of reasoning is applied to the journalists surveyed in the Pew Center poll, it is apparent that the phenomenon of liberal misidentification is very much present in the American press.
Compared to nine years ago, national journalists are much more of the opinion that the press is playing it soft on President Bush. Fifty-five percent say that the press is "not critical enough" of the president; just nine percent believe the media is being "too critical." By comparison, in 1995, only two percent thought the press gave too much coverage to the achievements of President Clinton. Forty-eight percent believed the media didn't cover them enough.
On moral issues, it is evident that despite the fact that 54 percent of journalists claim to be moderate, they are considerably more liberal than the public. Fifty-eight percent of Americans believe that it is necessary to believe in God in order to be a moral person. Just six percent of surveyed journalists believe this. Asked about whether homosexuality "should be accepted by society," 88 percent of journalists agreed, compared to 51 percent of Americans. Among journalists describing themselves as moderate, 84 percent were of this opinion.
On the subject of news bias, journalists were much more sensitive to conservative bias. When asked if they could name a national news organization that struck them as "especially liberal," 62 percent could not do so. Among those who could name a liberal news organization, 20 percent cited the New York Times.
By contrast, 82 percent were able to list a news organization that they felt was "especially conservative." Among this group, the vast majority (69 percent) cited Fox News Channel as the embodiment of conservative media bias.
While this poll did not ask the general public about its opinion of bias, most polls show that the public is much more likely to perceive a liberal slant in the news. In a survey conducted last year by the Gallup Organization, 45 percent of the respondents said that the press was "too liberal" while just 15 percent felt it was "too conservative."
Although he likely was not surveyed by the Pew Center, Dan Rather's attitudes toward himself and the press are strikingly similar to the national journalists who were polled. Like many of them, he insists that he is a moderate and is seemingly oblivious to the idea that the press is dominated by liberals who often inject their opinions into their stories.
Over the years, Rather has persistently denied that he is a liberal, despite his long record of favoring Democrats and liberals over Republicans and conservatives.
In 1999, when asked by CNN's Bill Press if he was a liberal, Rather rejected the idea out of hand, saying, "If I were, I would say so and I would be proud of it, but I'm not."
Asked about his opinion of claims that liberals dominate the news media and as a result bias the news to fit their views, the anchor told late night talk show host Tom Snyder that such claims were groundless:
"It's one of the great political myths, about press bias. Most reporters are interested in a story. Most reporters don't know whether they're Republican or Democrat, and vote every which way. Now, a lot of politicians would like you to believe otherwise, but that's the truth of the matter. I've worked around journalism all of my life, Tom Snyder has as well, and I think he'll agree with this, that most reporters, when you get to know them, would fall in the general category of kind of common-sense moderates."
Rather even shares most national journalists' opinion of the New York Times, as former CBS correspondent Bernard Goldberg recounted in a May 24, 2001 op-ed for the Wall Street Journal:
"In 1996 after I wrote about liberal bias on this very page, Dan was furious and during a phone conversation he indicated that picking the Wall Street Journal to air my views was especially appalling given the conservative views of the paper's editorial page. 'What do you consider the New York Times?' I asked him, since he had written op-eds for that paper. 'Middle of the road,' he said.
"I couldn't believe he was serious. The Times is a newspaper that has taken the liberal side of every important social issue of our time, which is fine with me. But if you see the New York Times editorial page as middle of the road, one thing is clear: You don't have a clue."
Other takes on the Pew Center's poll of journalists:
Read the center's summary of its findings here. For its methodology, see here. See also its press release
The Washington Post's Howard Kurtz quotes study director Tom Rosenstiel: "This is something journalists should worry about," he says. "Maybe diversity in the newsroom needs to mean more than ethnic and gender diversity." See also his online chat about the survey.
John Hinderaker noticed that the number of reporters concerned about unfairness in stories declined from 12 percent in 1999 to 5 percent in 2004.
Former newspaperman Mike Gordon on why journos see things differently than the public: "Journalists live in a world dominated by government, and they reflexively see government action as the default way to approach any problem."
Editor and Publisher notes that the number of self-described liberals increased dramatically from 1995.
San Diego Union-Tribune designer Matthew Hoy argues: "Most journalists look around the newsroom and can point out the Vietnam-era protester/Marxist/Leninist and say to themselves: 'Well, that guy's more liberal than I am.' Then they watch television and they see Tom DeLay or Jesse Helms and think: 'I'm definitely not a conservative.' Therefore, the only thing left is to label oneself a moderate."
James Joyner wonders, if journos think they're too easy on Bush, how a "too tough" media would cover the prez.
WSJ's James Taranto: "All this suggests that journalists not only are considerably more liberal than the general public but also wish their own coverage were more liberal than it is. No wonder public confidence in the press is suffering."
Scott Wrightson believes reporters call themselves moderates to avoid giving ammunition to supporters of the liberal bias theory.
Asian wire service Press Trust of India takes a different tack on the story in a piece headlined "Journalists have very low self-esteem: Survey."
USA Today completely ignores Pew's findings on bias, focusing on everything but. It does find room to promote a study by a liberal media watchdog group accusing NPR of conservative bias, though.
Dean Esmay: "All this tells me one thing: Blogs are more representative of the general population than the professional press is. Which should be no surprise. My guess is that an awful lot of people who lean right will tend to be people who choose careers other than journalism--but for whom blogging is a perfect outlet."
(From May 25, 2004)
Sunday, May 27, 2007
THE Tories spent the week beating themselves up and wondering why people called them the Nasty Party. Their big problem is that they've allowed Labour and the fascist Left to get away with portraying them as little better than the Nazi Party. They're so frightened of their own shadows that they've put up with the most appalling slander and libel.
The fact is that the real Nasties are all to be found on the Left. Most of the spite and class hatred comes not from Conservatives but from Labour, whether in the vindictive campaign against the countryside or the war on motorists. Or by deliberately trying to prevent private school pupils from getting to the best universities, regardless of how well they did in their exams. The siting of asylum seeker camps in Tory constituencies is another example.
I can't ever remember, in 18 years of Conservative rule, the Tories ever wanting to eliminate all Labour opposition. But Labour wants to drive the Tories into the sea, just as the Arabs want to wipe the Jews off the face of the earth.
I know from my experience as a columnist that if you attack any vaguely Conservative cause, the worst you can expect is a bit of grumbling and a few angry letters. But turn against the Left on Palestinian homicide bombers, illegal immigration or fox-hunting and the hate mail and death threats have to be read to be believed.
Challenge the smug New Labour/Guardianista axis and expect a barrage of lies, smears and character assassination in return.
The Conservatives have allowed themselves to be bullied into swallowing the Left's agenda. It's not that the Tories are the Nasty Party. It's that they're the Bloody Useless Party.
From: "Sun Online", Saturday, Oct 12th, 2002
Saturday, May 26, 2007
Kerry began by highlighting the strong points of the presidential election in 2004 and stated a strong confidence in winning the Senate elections in 2006. "I won 10 million more votes than any Democratic presidential nominee ever," he said in regards to the 2004 presidential election versus current President George W. Bush. He was met with applause when he mentioned, "It's a sad time for our country," in reference to President Bush's policies.
Kerry believes other nations depend on the United States to make a difference and that under President Bush's control, other countries do not have that support. Kerry hopes a Democratic majority in the U.S. Senate will help the country to "make sense of the despair and frustration" the nation now has. He encouraged young Democrats to convince more of their friends to fight for the Democratic cause, and to also involve younger teenagers so that they are more aware of the issues when it comes time for them to go to the voting booths. He remained confident that there will be a Democratic majority in the Senate in 2006.
Kerry said "reality and truth are values that define us as a nation" and that the country needs truth now more than ever before. He referenced the war in Iraq as the main truth the American people need to hear. "The presence of troops in Iraq are part of the problem. Success will not depend on how long we stay, the Iraqis will say 'we're going to let them stay as long as we want,'" said Kerry. "I think that's wrong. It's time for Iraqis to police Iraqis, it's time for Iraqis to stand up to Iraq."
Kerry then emphasized education, saying that "53 percent of high school students in the United States do not graduate high school." China and India are racing further ahead in education and technology fields and that this calls for a new national educational plan, he said.
Kerry says he wants to fix intelligence in the country; he wants good foreign policy and believes America needs to be less dependent on Saudi oil.
During the question and answer session, an Emerson College student asked Kerry what main party initiatives the Democratic grassroots organization should be focusing on. The five issues Kerry stated were energy, healthcare, fiscal spending, educational funding and homeland security. All of these, Kerry said, the student Democrats should be aware of and voice their awareness to other young citizens.
He also shared encouraging words to the grassroots volunteers. "I've seen the civil rights movement, the women's and environmental movements, and it was grassroots that succeeded in changing things," said Kerry. Kerry concluded by giving the students a little lesson in politics. "All politics is a reaction to felt needs. You need to get people to feel the need. Our job is to make sure the right felt need is taken into consideration."
(From November 07, 2005 )
Friday, May 25, 2007
The intellectual distance the Western world has traversed over the past two generations in how we think about markets, the state, and economic policy is nowhere better illustrated than in the changing reputation of the Austrian economist Friedrich A. Hayek (1899–1992). In the decade after publication of Hayek’s tract The Road to Serfdom (1944), in which he argued that expansion of the European welfare state was of a piece with spreading totalitarianism, he was regarded as little more than a right-wing crank, a provocateur who dressed up his own normative preferences for markets and individual freedom in the language of science. Today, by contrast, Hayek wears a richly deserved mantle of intellectual respectability. Winner of the Nobel Prize in economics in 1974, he is rightly seen as the intellectual godfather of the pro-market revolution that swept the West with Margaret Thatcher and Ronald Reagan. He has spawned an enormous following that extends well beyond the social sciences.
And yet, even those who claim to admire Hayek rarely understand that many of his most important ideas are critical not just of state intervention and planning as practiced by the Left, but of dominant currents in contemporary neoclassical economics as championed by the Right. Bruce Caldwell’s impressive new biography pulls together these themes and shows how the second critique logically grows out of the first.
All the threads in Hayek’s thought came together in the so-called socialist calculation debate of the late 1930s, in which he and other Austrian school economists challenged the view that centralized planning would yield greater economic growth. In such works as “Economics and Knowledge” and “The Use of Knowledge in Society,” Hayek’s critique of socialism was, at its core, empirical rather than normative. He argued that human knowledge is inevitably partial: There are limits to rationality, and what any individual knows tends to be local in nature. This is particularly true in a macroeconomy, which depends on the interactions of thousands, even millions, of individual producers and consumers.
The problem with socialism, Hayek argued, is that it seeks to replace the dispersed knowledge of those myriad actors with that of a single, omniscient planner. Socialist central planning cannot work because it attempts the impossible: using a static equilibrium model to capture unfathomably complex inputs and outputs characterized by dynamic, constantly shifting equilibria. In market economies, by contrast, the price mechanism provides information about preferences and relative scarcities to thousands of agents, whose continual exchanges produce a socially beneficial if unplanned outcome.
At the time of the socialist calculation debate, the Soviet economy was growing rapidly and the capitalist West was reeling from the Great Depression, leading many to consider socialism the superior system. Empirical validation of the Hayek thesis would have to await later decades, when centrally planned economies began to display huge dysfunctions arising from precisely the kinds of informational problems he had outlined. Today, virtually no one believes that the coordinating function of the price mechanism in a free market can be replaced by central planners using even the most powerful supercomputers. And we are much more likely to accept Hayek’s broader insight that social order—not simply markets but morality, social norms, the rule of law, and the like—is often the spontaneous and unplanned consequence of the interactions of dispersed individuals with limited knowledge, not the work of a single designer.
But Hayek also offered a far more profound critique of the limits of human reason, which extended to the models that would come to underlie postwar American neoclassical economics and, thus, the economics that we teach university students to this day. Caldwell explains that a constant theme in Hayek’s writing—from his early critique of “scientism” in his “Abuse of Reason” project to his last published work, The Fatal Conceit (1988)—is a critique not just of real-world planners but of positivist social scientists who aim to turn the study of human behavior into something as empirical and predictive as the physical sciences.
Like contemporary neoclassical economists, Hayek was a “methodological individualist” who believed that the behavior of groups needs to be explained in terms of the interactions of the individuals who make up the collectivity. But his view of individual choice was far more nuanced and complex than the typical neoclassical model of economic man. He understood that individuals are neither omniscient nor fully rational and are constrained by institutions, norms, and traditions that can be understood only through a study of history.
As Caldwell notes, Hayek initially thought the dividing line between possible and impossible positivism lay in the distinction between natural sciences and social sciences, but by the 1950s he had come to understand that the issue was really one of complexity. A positivist, predictive science is possible only for phenomena, whether human or natural, that are relatively simple—particle physics, for example. One can never fully model and predict complex phenomena such as the spontaneous orders produced by the interactions of simpler agents. These orders include the human brain, whose higher functions cannot possibly be inferred from its physical substratum, as well as ecosystems and, of course, markets, cultures, and other human institutions.
Hayek, in other words, fully anticipated the rise of what we now know as the study of complex adaptive systems, or complexity science. Drawing much of its inspiration from evolutionary biology, this approach is today practiced in such places as the Santa Fe Institute, a multidisciplinary think tank that uses agent-based simulations to model the emergence of complex behaviors on the part of larger collectivities. But Hayek would doubtless disapprove of the research agenda in much of the complexity field, which seeks to use these models to produce deterministic, predictive outcomes.
One of the most interesting parts of Caldwell’s book is the epilogue, which quotes Hayek toward the end of his life as saying he regretted his failure to return to his critique of Milton Friedman’s Essays in Positive Economics (1953) as much as his failure to revisit his critique of John Maynard Keynes. Hayek’s critique had nothing to do, of course, with Friedman’s preference for markets and limited government, but rather with his belief that economics could be turned into a rigorously empirical and predictive science. Caldwell notes that while econometric methodology has become far more sophisticated, and game-theoretic models ever more complex, economics’ promise to cumulate knowledge about universal laws of human behavior has remained largely unfulfilled. Thus, the highly mathematical and ahistorical turn that academic economics has taken in recent years would have been, for Hayek, as much an abuse of reason as the socialist planning of earlier generations.
Hayek’s Challenge is, as its subtitle implies, a purely intellectual biography that seeks to interpret the body of Hayek’s written work. One finds virtually no details of Hayek’s personal life—why he divorced his wife, or how he reacted to being awarded the Nobel Prize alongside the leftist Gunnar Myrdal. Instead, the book begins with a lengthy and informative intellectual history of Austrian economics, touching on such issues as the debate between Carl Menger and Gustav Schmoller of the German historical school. This exposition is critical to understanding the intellectual milieu in which Hayek studied, as well as interesting in itself because it anticipates the controversies that continue to divide contemporary positivist social science from more historical and ethnographic approaches to understanding things human.
Caldwell, an economic historian at the University of North Carolina at Greensboro, ends his book by plaintively noting that the un-Hayekian agenda of turning economics into a rigorous science has driven all other approaches, including the study of economic history, out of American economics departments. But the damage done by this positivist approach is, in fact, much greater. Economic methodology has colonized political science too, eliminating individuals with knowledge of real peoples, cultures, and history—for example, experts on the Middle East—from the country’s top schools. We are thus presented with a rather depressing picture of human progress. Although the particular brand of intellectual hubris that elevated central planning over markets is gone, other forms persist, and indeed have grown stronger. Hayek’s challenge remains an open one.
(From May, 2004)
Wednesday, May 23, 2007
Democrat Dick Harpootlian and Republican Henry McMaster made their comments at The Citadel on June 11 to about 800 high school seniors participating in the annual leadership event, The (Columbia) State reported Saturday. Harpootlian was quoted in the Boys State newspaper, The Citizen Times, talking about how state lottery funds could give qualified high school graduates $4,500 for college. "That can buy a lot of beer and girls," he said. McMaster followed by saying "Democrats are for beer and girls. Republicans are for cold beer and hot girls," the newspaper said.
Also at the Boys State gathering, Harpootlian said the parties disagree on principles but "agree on beer and girls." Only days earlier, on June 6, a memo from a fictitious "Men's Caucus" circulated among legislators in response to a real memo from the House Women's Caucus reminding pages about professional work attire. The fake directive said women pages could get bonuses for wearing tops with less material, that underwear was optional, and that skirts should be no longer than 4 inches above the knee. No author of the fake memo has come forward.
Gov. Jim Hodges has asked the state Human Affairs Commission to investigate. The federal Equal Employment Opportunity Commission has begun a preliminary inquiry. Rep. Vida Miller, D-Pawleys Island, asked for the House Ethics Committee to investigate the anonymous memo. House Speaker David Wilkins, R-Greenville, has pledged to find those involved. He has sent letters to all pages and their parents among other corrective actions ensuring a safe work environment.
Rep. Edie Rodgers, R-Beaufort, heads the Women's Caucus and said she was disappointed Harpootlian and McMaster would joke like that. "I'm sort of up to here with flippant comments from men at this point," Rodgers said. Harpootlian said "anyone that's perturbed at this, I'd say, 'Get a life.'"
McMaster said he used the humorous remark to grab the teens' attention before discussing issues like gun control and education. "In hindsight, I think Dick and I both got a little carried away," McMaster said. "But the stated purpose was to say, 'Let's forget about all this (beer and girls) and get serious.'"
Boys' State is a leadership-development program sponsored each year by the South Carolina American Legion. Attendees elect and run their own mock government and learn about public service. Tony Papadopoulos, a 17-year-old Boys' State participant from Mount Pleasant, said he thought it was "creative how (the chairmen) captured the audience's attention."
The phony "Men's Caucus" memo earlier this month began a debate on the place of women in South Carolina politics. The state ranks 50th in women holding statewide office, says the Institute for Women's Policy Research in Washington, D. C. "Too bad the leaders of the two major political parties in the state didn't use the occasion to elevate the discourse about politics," said Laura Woliver, associate director of women's studies at the University of South Carolina. "I can't believe they said this after the 'Men's Caucus' stuff," she said. "They just don't want to change."
Boys' State director Allen Bosworth said Harpootlian and McMaster are "provocative guys." But Bosworth said the comments were probably inappropriate and he might speak to the chairmen about them. Harpootlian said this to Democratic women who may be offended: "Don't go to Boys' State. If you see I'm speaking, definitely don't go to Boys' State."
June 24, 2001
Monday, May 21, 2007
It runs in the family: Before Nicole, her only sibling, Andrew, was the youngest student at the college. Now 14, he is a senior. Nicole said she can't imagine what it would be like to be in a regular classroom with other 12-year-olds. "Home schooling was a big advantage because you can go at your own pace," she said. Tan's legs aren't long enough to touch the floor when she sits back in her chair. Dressed in a small UC Davis shirt featuring a surfing Snoopy, the shy preteen doesn't look intimidating, but she will likely throw off a few test curves.
Tan passed the state high school proficiency exam three years ago and has since taken enough courses at a Pittsburg, California, community college to make her a junior in college. Accelerated home schooling allowed her to skip some dreaded teen-age experiences: junior high and the SAT college entrance exams.
She declined to provide any information about her parents, who declined to be interviewed. The family has moved into an on-campus apartment. "I play with other children my age," Tan said. "I don't study a fixed amount. Sometimes I study all day and sometimes not at all."
University administrators admit they had some concerns about enrolling a 12-year-old, but say Andrew Tan's success at the university convinced them. "We love to have young scholars here," said admissions director Gary Tudor. "We are paying high attention to her well-being. But she has earned the right to be here and we are pleased to give her the opportunity of some accelerated learning."
Davis students say the young student should also try to squeeze in other college activities. "A big part of college is finding out what kind of person you are and you can't get that just by studying," freshman Lisa Robbins said. Nicole said she probably won't go to football games, but wants to hang out with her classmates. She might even help them with their homework. "If they ask, probably," she said.
From September 29, 2000
Sunday, May 20, 2007
In breaking music news, Brisbane band Topology has created a piece called McLibel, based on Britain's longest-running civil trial. According to the composer, bassist Robert Davidson, the band was attracted to the "David and Goliath" element in the story.
The case was, indeed, a story of David and Goliath: one in which a gigantic, unaccountable, transnational monster (the international green movement) used every lie and fabrication in the book against a company (Macca's) that simply tries to provide tasty and convenient meals, employs hundreds of thousands of workers, is accountable to its investors and the market and, unlike the green movement itself, acts strictly within the laws of every country in which it operates.
But the news that we are going to enjoy an original musical composition based on the case, as well as the lurking suspicion that it will not take the view just outlined, started me wondering: at what point, exactly, did the mouthing of an approved set of political pieties become part of the job-description for Australian artists?
Step forward, the late Judith Wright. If she was not the source of the syndrome, the distinguished Australian poet was surely its most potent exemplar. If there was a fussy, daffy campaign going that tried to put a baffle in front of capitalism and progress, Wright was a walk-up start for it. The lady marched more miles than Chairman Mao.
A daughter of rich New England squatters, Wright spent most of her last 30 years combating the very economic freedoms that could potentially allow others to experience the prosperity that she had enjoyed as a birthright.
I take the unorthodox view that the Australian public has better judgment in most matters than the intelligentsia, and the public has surely learned to ignore the political bleatings of artists - which, given the mass sign-on to fascism in the 1920s and '30s, followed by the wholehearted surrender to communism in the '40s and '50s, is both wise and benign.
But Wright was never a communist, and what makes her a key transitional figure is that she was a pioneer of the retreat of the arts into what I call Wetworld.
"Do you mean that dreadful Kevin Costner film, Imre?" Incorrect, Jose. I mean a nation-within-a-nation that boasts its own religion (the Uniting Church), its own political party (the Democrats), its own think-tank (ACOSS), even its own national broadcasting network (the ABC).
At its most harmless, Wetworld is the steady-drip water-torture of a Tim Costello or a Natasha Stott Despoja; at its full-bore water-cannon worst, it is the kind of green-left fascism that we are likely to see directed against the World Economic Forum meeting in Melbourne next month.
Above all, Wetworld has provided a secure platform from which artists and intellectuals have been able to maintain their long campaign against the economic and social arrangements that underwrite their own prosperity and freedom of expression. Nowadays, even artists hostile to the Left, like Les Murray, tend to be conservative wets.
But Wright had arrived at this bad place even before Patrick White first pulled a tea-cosy down over his ears and began his own Long March up and down Oxford Street. All of Wetworld's key obsessions were bees in Wright's bonnet. For example, the idea that, with a land mass the size of the United States and fewer inhabitants than London, Australia was facing a "population crisis".
In the late '60s and '70s, Wright found little time for her poetry, as she sought instead to warn an uncaring nation about the imminence of nuclear and environmental holocausts. (Remember those holocausts, where we had to claw our way over each other's stinking dead bodies to collect scraps of food for our young? Bummer!)
By the end of her life, Wright had arrived at something like the opposite of the faith that every genuine artist clings to: "Anyone can write poetry," she declared (quite incorrectly, as it happens), "but to be an activist is far more important."
Rather than turning off the tap, Wright's death in June simply opened the sluice-gates. At a memorial service in Canberra, her biographer, Sister Veronica Brady, foamed against the evils of economic growth and "insane materialism", and referred to the Prime Minister as "the horrible little man who doesn't live here" - surely the strangest thing ever said by a nun in a eulogy.
In an article about the Concorde accident last month, this newspaper published a photograph that showed a serious-looking middle-aged woman holding up a large placard, which read, "Protect Aborigines from The Boom". The photograph was captioned: "Protester at Sydney airport during a Concorde visit in 1972". Guess who.
From “The Sydney Morning Herald” of August 14, 2000
Saturday, May 19, 2007
Those who would love to see their fellowmen live and not die should bless and not condemn the United States. Why? Because it is the combination of bad US imperialism and equally bad US discrimination that leads to the cultural sorting inside Superbrains which gives us our lives.
When the United States was about to send its elite soldiers to Afghanistan, President Bush gave them a “pep talk” in which he said: “We in the United States are in no way the first who have increased our power in the world. It has always been an established rule that the weaker is kept down by the stronger. We might even be worthy to have this global hegemony. We might even be a bit commended. Because even if we, too, have yielded to the instinct of human nature to rule over others, the United States has been more observant of justice than we might have been, considering our power. If others should seize our power, they would, we think, like Stalin and Hitler, exhibit the best proof that we show some moderation; but in our case the result of our very reasonableness is, perversely enough, obloquy rather than commendation.”
This quote does not come from Bush. With some small changes, it is 2,430 years older. Thucydides (Loeb 108, p. 129) has an Athenian speaker say this in defence of the imperial policy of Athens as an expression of “the instinct of human nature to rule over others”.
So it may be. Today you can find scientists who claim that “dominance behaviour”, like sex and hunger, belongs among our most deeply rooted behaviors, seated in the “reptile brain”.
Secondly, the USA is not only imperialistic. It would also give a Swedish “discrimination ombudsman” plenty of work. In some student papers, for instance, an in vitro fertilization company sought egg donations from young women with high IQs and sexy curves. Should that really be permitted? No, said some members of President Bush’s new “Bioethical Council”. Well, said the chairman, then would you also permit egg donations from women with known genetic diseases? You say no! Then where do you want to draw the line?
We in Europe often condemn the United States for its discrimination between different people. There, intelligence testing is entirely routine. University admission is based on IQ tests. Companies use “IQ consultants”, specialists in judging how high an IQ level an employee should have for a given job.
This can take place because we now know that we are not quite as equal as nice declarations of human rights tell us. Genetic research has shown that between any two unrelated human beings there are between three and six million differences in our alleles, or gene variations, affecting physical and mental capacities.
What is called discrimination may, however, often also be termed cultural sorting. That, to me, is the third step in the long history of mankind.
About four billion years ago, the first life emerged in the form of bacteria. Evolution has needed all that time to create us “inflated bacteria”.
Three million years ago, man and the chimpanzee had the same brain volume. They still have about 400 cubic centimetres. We have gained one litre more. That is the result of a natural selection that has made us the lords of the earth.
But what is it that during the latest forty years has given life to as many new human beings, three thousand million, as arose during all of our earlier existence, some five million years? What is it that in only forty years has been equally “life-giving” as five million years of unconscious evolution?
Answer: the conscious cultural sorting that has taken place, most consciously in the multinational Superbrains of the United States. That, at least, is my hypothesis.
Sorting of what?
It took nature three million years to increase the brain volume of a human individual by one litre of grey matter. Unite individual brains of 1,500 cubic centimetres each and get them to function as one, and you achieve an increase in the volume of the intelligence base three times bigger than that given by three million years of individual growth.
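The arithmetic behind the "three times bigger" claim is never spelled out. On one reading (an assumption of mine, not stated by the author), it follows from linking just three such brains:

```python
# Sketch of the arithmetic implied above. Assumption (mine, not the
# author's): "three times bigger" comes from linking just three brains.

evolutionary_gain_cc = 1000      # one litre of grey matter, gained over
                                 # three million years of evolution
individual_brain_cc = 1500       # volume of a single modern human brain

brains_linked = 3
combined_cc = brains_linked * individual_brain_cc    # 4500 cc in total
increase_cc = combined_cc - individual_brain_cc      # 3000 cc gained

ratio = increase_cc / evolutionary_gain_cc
print(ratio)  # 3.0 -- three times the evolutionary gain
```

Any larger number of linked brains would, of course, make the factor larger still; three is simply the smallest case that yields the figure in the text.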
Behind a verbal mist of terms like “knowledge society”, “human capital”, “brain capacity”, “brain drain” and “network”, what really takes place is a “cultural sorting” of measurable and highly unequal human intelligence, with about half of the inequality biologically inherited.
Openly in the United States, and more hidden behind this mist in our European multinational companies, thousands and hundreds of thousands of highly gifted brains are united in this manner and stimulated to function as brain cells, neurones, inside what I, in my latest two books, have termed “co-thinking Superbrains”.
Out of these emerge most of the innovations which, during the last forty years, have more than doubled the production of food on our globe and thus permitted 200,000 new children to survive each single day. It is also through the global networks of these imperialistic companies that this food has been distributed to the poor of the world. Extreme hunger today exists only in nations engaged in warfare or in a few remaining communist nations.
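The two figures used throughout this piece, three thousand million new people since 1960 and 200,000 surviving children per day, are at least mutually consistent; a quick check (my arithmetic, not in the original):

```python
# Consistency check between the article's two figures: three thousand
# million new people over forty years versus "200,000 new children ...
# each single day".

new_people = 3_000_000_000   # "three thousand million" since 1960
days = 40 * 365.25           # forty years expressed in days

per_day = new_people / days
print(int(per_day))  # roughly 205,000 per day, close to the stated figure
```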
Conclusion: The sum of American imperialism and discrimination, or, what is the same thing, its global power and conscious sorting of men and women with different physical and mental capacities, has in only forty years created a global economic power which has been equally “life-giving” as that which evolution took five million years to give to mankind.
More than anything else, it is this American creation of “cultural sorting within co-thinking Superbrains” that has given the three thousand million new children born after 1960 their lives.
Thus, those who would love to see their fellowmen live and not die should bless and not condemn the United States.
This article was first published in the Swedish financial daily “Finanstidningen” on February 5, 2002, under the title “USA - Our Genetic Destiny”.
Friday, May 18, 2007
The row began when a survey indicated that the GMC was registering -- and upholding -- a disproportionate number of complaints against foreign practitioners. Some of these complaints turned on the fact that overseas doctors were poorly qualified, others on their rudimentary grasp of English.
Faced with the evidence, the Tory health spokesman, Liam Fox, himself a former GP, came up with an eminently reasonable set of proposals aimed at raising standards. Dr Fox suggested that the background of foreign doctors should be checked (there have been horrendous cases involving physicians who had been struck off for malpractice in their home countries), and that their linguistic ability should be rigorously assessed.
One might have expected these ideas to be wholly uncontroversial. Indeed, it is astonishing that anyone should need to propose them. Yet Labour and Liberal Democrat politicians have been howling with rage, accusing Dr Fox of coded racism.
If there is racism here, it is actually Labour's. For Dr Fox's proposals would primarily have affected practitioners from the rest of the EU, who, under single market rules, are exempt from the linguistic tests that are applied to the nationals of other countries. Although Britain is required, under the European Treaties, to recognise the qualifications of European doctors, Dr Fox has pointed out that there is nothing to prevent individual employers from imposing an additional, linguistic, test on them.
Yet Labour jumped to the conclusion that any reference to foreigners must be to people who are not white. In fact, the problem is not with doctors of Commonwealth origin whose English is usually excellent, but with Europeans. It is a vivid illustration of how powerless we are before Brussels that we cannot subject EU nationals to the same tests that we impose on other foreign professionals.
So patient care must take second place, not only to conventional political correctness, but also to the new strain of Euro-correctness which is flourishing under this administration. Dr Fox has bravely refused to allow the Left to place this matter off limits. It is intolerable that a senior politician in a major democracy should be prevented from speaking out on his own portfolio. By breaking the taboo, Dr Fox has done us all a favour.
From The Daily Telegraph, London. 29 September, 2000
Thursday, May 17, 2007
The story drew the attention of Health Minister Annette King, who has asked health officials to brief her on the case.
Matthews is suffering from dry gangrene following cardiac surgery 20 months ago in Christchurch Hospital and his fingers are rotting and dying as a result. Two of his fingers have already rotted off and he is desperate for surgery to amputate two remaining fingers. He told Television One News he had been taking the strongest painkillers available since January last year to combat the often unbearable pain. He used a pair of kitchen scissors earlier this year in desperation to chop off one of his rotting fingers.
Christchurch Hospital general manager Jim Magee said Matthews had been scheduled for surgery in the first week of September. However, Matthews said he received a letter yesterday from the hospital telling him he would have to wait a further six months at least. Magee said Matthews had a history of severe coronary artery disease, heart disease, renal failure, diverticulitis and diabetes, and had suffered three strokes.
Staffing shortages are endemic throughout most New Zealand public hospitals which complain of underfunding. Many patients increasingly opt for private health care if they can afford it.
Matthews said he would not let even his small pet dog suffer the way he had. "I'd shoot him rather than let him be treated the way I have," he said.
From: AFP 15 August, 2002
Wednesday, May 16, 2007
I have long written about how adherence to the Modern Liberal philosophy requires a childish mentality. The grown-up mind, forged in the real world, quickly dismisses the infantile "tooth fairy" mentality of the left, where riches are bestowed on one, without labor or merit, in one's sleep. Those who mature soon grow out of the Democrats' disdain for personal responsibility, seeing it as more befitting an angry teenager looking for someone to blame than grown men and women seeking to make a better life and a better world.
That Modern Liberals are little more morally and intellectually developed than children came to mind yet again in the wake of the infantile temper tantrum thrown by CNN's leftist correspondent John King. Having once more been bested by harder-working journalists, King pitched a childish fit that resembled nothing so much as an eight-year-old who has just been told "no" to a second piece of pie he feels entitled to but did nothing to help bake.
The story begins when King, dawdling about on the beach in France, caught a harder-working journalist from another network delivering the news that President Ronald Reagan had just died. King -- and CNN -- had, yet again, been scooped.
Rather than respond as would an adult -- by gritting his teeth, tipping his hat to the better reporter, and quietly steeling his determination to get it right the next time -- King lost it. He screamed and yelled and threw things around. It goes without saying that in the accompanying diatribe he blamed everyone except himself for his latest failure.
Unfortunately for King he, like a toddler kicking and screaming at a K-Mart, was so absorbed in his self-pity that he took no notice of his surroundings and the people all around him laughing at his silliness. Worse luck for King was that he was not just surrounded by other people, but also by scores of TV cameras assembled for the sixtieth anniversary of the D-Day invasion. Soon King's childish tantrum may well be as easily available on the Internet as that of such other foolish, self-absorbed brats as Paris Hilton.
It would be easy to dismiss King's infantile outburst as just another tantrum by yet another pampered leftist media prima donna (remember Dan Rather bolting off the set because he didn't get enough face time?) except that this behavior is not limited to just far left reporters. In fact it has become the norm amongst the leftist power elite as a whole.
Today immaturity reigns in the Democratic Party -- from the Clinton frat house to the left's top authors (while conservative titles include the thoughtful "Why We Fight" and "Inside American Education", the bestsellers of the left are "Rush Limbaugh is a Big Fat Idiot", New York Times writer Molly Ivins's "Shrub" -- a fifth grader's jealous play on the name of the popular new kid, Bush -- and her infantile sequel "Bushwhacked."). In fact, not only does immaturity reign, to today's Democrat it is considered a badge of honor.
This truth has not eluded Al Gore, a man who has made his career on trying to read the latest trends and adapting his persona and policies in accordance with whatever is currently politically fashionable. Few will ever forget the 2000 election in which the former Vice President paraded out a revamped wardrobe and a new policy position with the release of every new poll. Sensing today's trend, Gore's most recent incarnation has seen this dull "centrist" reappear as a tantrum-throwing, red-faced screamer, billowing words calculated and scripted to be as angry and incoherent as any put-upon teenager's -- his mindless rants designed more to boggle the mind than engender serious consideration.
Similarly, Howard Dean, a man who recognized early on that his credentials for leadership were less than nil, figured out that power in the Democratic Party is bestowed not on the reasonable and rational but on those willing to spew the most insane slanders at the loudest volume. Dean played these two notes to near perfection and saw himself catapulted from being the failed former Governor of one of America's tiniest states to within an inch of the Democratic Party's nomination for the highest office in the land.
Sadly Joe Lieberman either never understood the new premium on childish behavior or was too decent to stoop to the level of the Gores and Deans, for, more than anything, it was Lieberman's adult take on the world and his mature presentation of facts that saw him so unceremoniously and overwhelmingly rejected by the Democratic Party.
Anyone who has raised a teenager recognizes the ridiculous hyperbole that has become the standard fare in stump speeches and personal appearances by today's Democratic Party "leaders." Policy statements one presumes are meant to be taken seriously are as ludicrous as any 13-year-old who cries "you're ruining my life" over the most trivial of matters or "you're the worst mother ever" in response to being asked to help with a simple chore.
Thus the Patriot Act isn't just "flawed" but rather "designed to steal all of your human rights!!!" Those who disagree with the leftists aren't just "misguided" but rather are all "lying liars". Judges like Miguel Estrada aren't just "of a different mind" but are, as Ted Kennedy called them, "subhuman Neanderthals!!!"
In fact one cannot watch a John Kerry speech without hearing at least a handful of these foolish slanders. President Bush's policies and their consequences are never to be honestly appraised and thoughtfully challenged. Instead they are all -- every one of them -- "the worst" this or "the worst" that in "the history of the world!!!" While parents of young children are used to such over-the-top silliness from their kids, it wasn't until very recently, when immaturity became the dominant trait of the Democratic Party, that one could expect it as a substitute for legitimate political debate from "serious" candidates for President.
This type of insane hyperbole has become the norm in the Democratic Party because, like the child, the leftist lives in a world of self-importance, where nothing is bigger than the servicing of his immediate wants. A child can believe his parent is "the worst in the history of the world" because the child knows little -- and cares even less -- about the world outside his tiny domain. The same is true of the Modern Liberal, who gives little thought to the long-term consequences of his actions or the historical context of his words.
When Democrats want something (such as their own empowerment) it matters not who they hurt nor the damage they do to their neighbors, their nation or the world. When John Kerry slanders the Commander-in-Chief at a time of war, when Howard Dean tries to spread an outright lie about nine eleven, when Ted Kennedy undermines the nomination of a Mexican-American judge they don't think about the troops in the field, the victims of the terror attacks or the young Mexican-American child who might have seen Miguel Estrada as a role model. Today's Democrats, like small children, see only their own immediate gratification and the chance to advance their short-term personal wants.
Perhaps this self-centered and self-seeking nature of the leftist -- and its disastrous effects upon others -- was best exemplified in Kerry's recent rationalizations for his having viciously slandered one million fellow Americans -- people he now cynically dubs his "band of brothers" -- during the Vietnam War. Asked to explain his inexcusable lies Kerry said only "I was young, I was angry and I just wanted the war to end."
I, I, I -- the mantra of the child -- remains the mantra of the Modern Liberal. "I wanted to have sex so I kept foreign leaders waiting." (Bill Clinton). "I didn't want bad publicity so I let Mary Jo Kopechne drown." (Ted Kennedy). "I wanted to be famous so I helped fake the rape of a young black child." (Al Sharpton). And the list goes on and on.
Does John King's childish tantrum rise to this level? Of course not. All he did was embarrass his network, damage some equipment and belittle the people in his charge. But John King is only a reporter seeking a scoop. When the prize is the highest office in the free world this self-centered immaturity can and does bring devastating results. The Democratic Party's penchant for petulance is one badge that John Kerry would do well to this time actually throw away.
Copyright © 2004 Evan Sayet
Evan Sayet was a writer for Television's "Politically Incorrect", wrote and produced the Discovery Channel's "The 70's: When Decades Attack" and is active in the political community.
Evan Sayet can be contacted at ESayet2004@aol.com
The above article originally appeared at BlueStarBase forums at the following address:
Tuesday, May 15, 2007
Call to war: Retrosi's The Complaining Citizen, 1914
"We shall sing the love of danger, energy and boldness!" the Futurist Manifesto shouted from the rooftops in 1909. "We declare that the world's splendour has been enriched by a new beauty: the beauty of speed. There is no more beauty except in strife, no masterpiece without aggressiveness, a violent onslaught upon the unknown forces, to force them to bow to the will of man ...
"We wish to glorify war -- the only hygiene of the world -- militarism, patriotism, the destructive arm of the anarchist, the beautiful ideas that kill!"
The futurists also set out "to destroy the museums, the libraries", adding: "It is in Italy that we launch this manifesto of violence, destructive and incendiary, by which we this day found futurism, because we would deliver Italy from its canker of professors, archeologists, cicerones and antiquaries ... free her from the numberless museums which cover her like so many cemeteries."
Fortunately -- not least for the futurists themselves -- the latter part of the manifesto was never fulfilled, otherwise those detested "museums and libraries" would not be able, nearly a century on, to showcase the achievements of futurist literature and art. A case in point is Barbed Wit: Italian Satire of the Great War, at the Estorick Collection of Modern Italian Art in London, where visitors can admire rarely seen original artwork for the bitingly satirical postcards produced in Italy during World War I.
Postcards as an art form? Indeed, say the curators of the show -- organised jointly with the Imperial War Museum -- postcards could be "rapidly distributed to a mass public", and in the early 20th century were used to convey social and political messages.
Today most Italians are ambivalent at best about war, if not downright pacifist, as former prime minister Silvio Berlusconi found to his cost: his decision to send Italian forces to help reconstruct Iraq after the 2003 US-led invasion gave rise to anti-war protests as reconstruction deteriorated rapidly into an occupation fighting an insurgency.
The roots of Italian pacifism lie partly in the bruising experiences of the 1930s and '40s, when Benito Mussolini led his country first into costly colonial adventures and then, after initial hesitation, into a disastrous alliance with Adolf Hitler.
But the roots go even deeper, to World War I, when at the outbreak of hostilities in 1914 Italy formed part of the Triple Alliance with Germany and Austria-Hungary against the Triple Entente: Britain, France and Russia.
Italy was a reluctant warmonger from the start, preferring its traditional neutrality, to the disgust of the futurists. This hesitancy to make a commitment to war is neatly satirised in Virgilio Retrosi's image of a red-faced Italian infantryman pondering whether to follow a signpost pointing to the "European Theatre". "Shall I just be an extra, or should I take a starring role?" runs the caption.
Italian vacillation is also captured in another Retrosi work, To Go or Not to Go, which shows a young woman picking petals off flowers.
Not all Italian artists were jingoistic; Giulio Gigli's postcard imitates the style of another futurist, Gino Severini, in a semi-abstract, dynamic composition that merges the national colours of France, Germany and Belgium with bullets, shrapnel and interspersed words such as "misery" and "snow".
As Nadia Marchiani, who curated an exhibition on Italian art and World War I in Florence in 2006, points out, Umberto Boccioni, one of the foremost futurists, lost much of his bravado after joining the wonderfully named Volunteer Battalion of Cyclists and Automobilists at the front. Boccioni, who once declared that "art is always above war and is not troubled by it", confided distinctly "troubled and eminently human reflections" to his diary after the battle of Dosso Casina, Marchiani says.
The Cezannesque rhythms of his later work, she notes, are in contrast to his earlier explosive experimentalism.
Retrosi also offered a grim vision in his Il Volto della Guerra (The Face of War), with a Gorgon-like head regurgitating skulls and coins. But many futurists remained wedded to the idea of war as "the only hygiene" and used the postcards to mock the prosperous Italian bourgeoisie, depicted as cynics who were happy to profit from war while shirking their patriotic duty.
In Armed Neutrality, Victor Emmanuel III is shown as a diminutive figure peeping out of an excessively armoured suit, shackled by the chains of indecision. Not until the secret Treaty of London in April 1915, when Italy (after being promised territorial gains) switched sides by joining the Triple Entente and officially declared war against Austria-Hungary, did the propagandists finally have a foreign enemy in their sights.
Raffaelo Ferro's design of 1918 admiringly refers to the exploits of the eccentric nationalist poet Gabriele d'Annunzio, who in a daring propaganda stunt flew over Vienna, scattering red, white and green postcards appealing to the Austrians to turn on their government.
Ferro, depicting both poets with laurel wreaths round their heads, uses a quotation from Dante: "Poveri versi miei gettati al vento" ("My poor verses have been scattered to the wind").
Ferro also offers a propagandistic image of the Triple Alliance, with a spiked German helmet clamped over the globe (German Global Domination), while another postcard mocks Giovanni Giolitti, the prime minister, for trying to make territorial gains without going to war. Several postcards show emperor Franz Joseph of Austria-Hungary as a decrepit villain about to be overwhelmed by Italy, the "unstoppable avalanche".
In the end, Italy suffered heavy losses and war turned out to be not so glorious after all, although Italy gained Trento, the South Tyrol, Trieste and Istria in what D'Annunzio referred to as a "mutilated victory".
Dreams of glory, military triumph and even imperialism lingered on, fuelling the rise in 1922 of Mussolini, who (like Hitler) had fought in World War I.
If futurism remains a sensitive subject in Italy, it is because of its associations with fascism as much as its chauvinism. Filippo Tommaso Marinetti, the futurist movement's acknowledged leader, allied himself -- and futurism -- with Mussolini from an early stage.
Nor is this merely academic: the centre-Left Government of Romano Prodi is facing a dilemma over the centenary of the 1909 futurist declaration. Culture Minister Francesco Rutelli, asked recently by the centre-Right opposition what plans he had to mark the event, replied that futurism "was the most important movement in Italy in the first half of the 20th century", but that he hoped that polemics could be set aside.
Not much chance of that: the second biggest party in the centre-Right opposition after Berlusconi's Forza Italia is the Alleanza Nazionale, a direct, if reformed, descendant of Mussolini's Blackshirts. As so often in Italy, the past informs the present. The Times
Barbed Wit: Italian Satire of the Great War is at the Estorick Collection of Modern Italian Art, London, until March 18.
Article above originally appeared January 22, 2007
Monday, May 14, 2007
MONEY can buy happiness and the best investment advice may be as simple as the sports shoe slogan: just do it. That's the conclusion drawn by researchers who set out to identify what sort of spending made people happiest. The psychologists, from Cornell University and the University of Colorado in the US, compared ``experiential purchases'' -- things such as holidays, concerts or dining out -- with ``material purchases'' such as clothing, beauty products, stereos or personal computers.
``For many of us, deciding how to invest our resources to maximise happiness is a challenge,'' they wrote in a Journal of Personality and Social Psychology report this month. ``We wonder whether more money, more leisure or more stuff would make us happier.'' The researchers asked more than 1500 people to rate their reactions to different purchases in five separate studies.
When asked to rate one of their own recent purchases on a ``happiness scale'' of one to nine, respondents consistently rated experiences about one point higher than material purchases in terms of being ``money well spent'', ``contributing to overall happiness in life'' and providing happy memories. When they directly compared their experiential and material purchases, only 34 per cent said they were happier with material objects.
Even people on very low incomes said extra spending on holidays or concerts made them happier than buying objects for their personal use or around the house.
``Experiences make people happier because they are more open to positive reinterpretation, are a more meaningful part of one's identity, and contribute more to successful social relationships,'' the researchers concluded.
In Australia, experiential spending has fuelled a boom for companies specialising in adventure travel and ``unique experiences'' such as jet-fighter flights, swimming with sharks and race-car driving. ``These experiences have a cachet attached to them,'' said Belinda Wong, manager of adventure specialist Atomic Dog. ``There's only so much to say about your new sound system, but you can talk about jumping out of a plane at a dinner party and suddenly the whole table is impressed.''
After travelling the world for six months, Tricia Hannah was given a tandem skydiving voucher by a friend while visiting her family in Melbourne. ``That six-minute experience jumping out of a plane just ranked above everything else I've done,'' said the 33-year-old accountant, who now lives in Glasgow. ``When I met my friends the next day, the first thing I did was show them my skydiving pictures, rather than all those amazing cities I've been to overseas.''
The above story originally appeared on 12 JAN 2004
Sunday, May 13, 2007
It was meant to be a peaceful anti-war protest by students
WITH bottles and knives in their hands and hate in their hearts, a mob of violent troublemakers yesterday ambushed a student anti-war rally to lead a vicious rampage through Sydney streets. A group of young men, described by police as ``Middle Eastern males'', created havoc by throwing chairs, rocks, bottles, eggs and golf balls at police and media during several hours of chaos in the CBD. Police also seized two knives from protesters, one of which fell on to the ground in the midst of a scuffle.
The violent spectacle began at Town Hall and resulted in two police officers and a number of protesters being injured. The two officers were struck in the head, one by a bottle and another by a golf ball, as they fought to contain the crowd which surged through containment lines. At least 45 mostly teenage participants were arrested -- including a boy aged 10 -- after the rally erupted into violence about 12.30pm.
Assistant Police Commissioner Dick Adams said it was clear a large proportion of protesters had come to the rally ``for the express purpose of fighting police.'' ``We had a group of people who went to Town Hall [for] nothing other than to incite trouble,'' he said. ``A large group of Middle Eastern males started to engage and incite the police in St Andrew's Square [near Town Hall] and they started to pick up cafe furniture from the area and throw it at police.''
Clutching placards condemning Prime Minister John Howard and US President George W. Bush, a 2000-strong group of students -- many wearing their school uniforms -- gathered for the protest at midday.
Yelling profanities and defying police instructions, the mob grew increasingly brazen until a scuffle broke out between protesters and police. The fracas quickly spread to a nearby cafe, where some youths threw chairs from an outdoor seating area at police, photographers and reporters.
Students burnt an American flag before moving on to Hyde Park, where scores of protesters cavorted in the Archibald Fountain. Hundreds of police officers surrounded the exits to the park, ensuring the crowd was contained within its boundaries. After leaving the park, the marchers moved through the Pitt St Mall and Castlereagh St, before congregating outside Mr Howard's Phillip St offices. There, the protesters hurled further insults while at least 50 police, including mounted officers, tried to contain them.
Australian Arabic community leaders condemned the violent clashes but rejected police claims the perpetrators were Middle Eastern men. However, police countered that television footage clearly showed the majority of those arrested appeared to be of the same ethnic background.
(From the Sydney "Daily Telegraph" of THU 27 MAR 2003)
STUDENT anti-war protests turned violent in cities around the nation, reflecting a new, strident mood among Australians frustrated at a federal Government ignoring their message of peace. Dozens of teenagers and other demonstrators were arrested in Sydney when a peace rally ended up in a wild riot. There were also arrests at rallies in Brisbane, while in Melbourne students clashed with police and burned US flags and effigies of the Prime Minister and US President George W. Bush. Mounted police broke up a 500-strong protest in Perth as paint, urine and tomatoes were thrown at the US consulate.
Rally organisers in Sydney claimed the day was a success and warned they would hold a similar protest next week with the aim of shutting down the entire CBD.
``We want to cause as much disruption as possible,'' organiser Jarvis Ryan said.
The Sydney rally began peacefully with about 5000 students gathering at Town Hall Square to demonstrate against war with Iraq, but ended in chaos. With bottles and knives in their hands, ``and hate in their hearts'', according to The Daily Telegraph, a mob of ``violent trouble-makers'' -- described by police as ``Middle Eastern males'' -- ambushed the student rally to lead a ``vicious rampage through Sydney's streets''. Cafe chairs, rocks, bottles, eggs and golf balls were thrown at police, two knives were seized and 33 people were arrested at the rally, some reportedly as young as 10.
The Australian reported ``a minority of protesters burned flags and chanted `Allah is great'''. The arrests occurred, according to The Sydney Morning Herald, when police declared the protest an unlawful assembly and attempted to end it.
The previous day a Newspoll published in The Australian found support for the war had jumped to 50 per cent in recent weeks, offering evidence of a nation genuinely divided.
Prime Minister John Howard -- enjoying a popularity rating more than 41 points higher than his beleaguered Labor counterpart Simon Crean -- pushed for Australia to get a seat at the table for the reconstruction of post-war Iraq, along with the US and Britain. Australia wants administration of Iraq to be transferred to the UN as soon as possible, but accepts the ``moral authority'' of the US to administer Iraq in the war's immediate aftermath.
(From 29 MAR 2003 in "The Australian")
Saturday, May 12, 2007
Thoughtful persons have long compared the totalitarian systems of the twentieth century. Indeed, the application of the word "totalitarian" beyond its original Italian context has been an act of comparison. But since the emergence of Bolshevism and Italian Fascism by the early 1920s, Western scholars - and frequently totalitarian ideologues themselves - have tended to conceptualize the Marxist-Leninist system as a political opposite to Mussolini's Fascist party and regime, as well as to German National Socialism and the various other "fascist" parties in the thirties. The standard political spectrum taught yearly in thousands of college classrooms only makes sense as a product of this specific conceptualization.
On the other hand, from the 1930s onward (in a few cases one may say from the 1920s onward), classical liberals, libertarians, and paleoconservatives have, to varying extents, rejected the standard political continuum for the very reason that it seemed to be based on inadequate criteria and even false premises. After all, a spectrum that put Communism and Nazism at diametric extremes distorted reality in significant ways. Yet, rigorous comparisons of Communism and Fascism in the mainstream of Western intellectual life have, in most cases, been cut short by reverence for the great "intellectual" orthodoxy that Communism was a great and well-meaning experiment which unfortunately created some "excesses."
Both for those who have long contemplated the similarities of the supposedly antipodal "extreme right" and "extreme left," and for those who are just working their way into this fascinating subject, The Faces of Janus will be a welcome and highly illuminating work. A. James Gregor is a prolific authority on both Marxism and Fascism, and he offers us here a work of mature, careful, and extensive scholarship on the relationship between Marxism-Leninism and Fascism.
Gregor begins by pointing out some gross disjunctions in Western theories of twentieth-century revolution. Fairly consistently since the 1930s, academic, literary, and intellectual observers have identified Marxist-Leninist and fascist movements as polar opposites. Although some scholars began to apply the term Totalitarian (which came from the Italian Fascist vocabulary) to both "Right" and "Left" forms of ideologically authoritarian regimes, Western academics continued to view Fascism and Soviet Communism in terms of a strict dichotomy.
Fascism was irrational, Communism was rational, even scientific. Fascism was nationalist, Communism was internationalist. Fascism was selfish and aggressive, Communism was a well-meaning (albeit sometimes bumbling) attempt at universal sharing. Fascism was an evil design, Communism was the Great Experiment. And so forth. Gregor, on the other hand, shows that the failed Marxist-Leninist revolutions do indeed look very much like the failed Fascist revolution of Italy and the various Fascist-like revolutions (including that of the National Socialists). In fact, Gregor finds contradictions to the standard political spectrum not only in Stalin's "socialism in one country" but in Fascist thought as well.
Indeed, one of the valuable contributions of this book is Gregor's examination of the little-emphasized early career of Mussolini as a leading Italian Marxist and syndicalist theorist. By way of a note, since the book is at its core a study of Italian Fascism and Russian Communism, Gregor says little about the National Socialist regime in Germany, though he does point out more than once that Fascism was very much the pioneer, Nazism very much the follower. In fact, he discusses numerous other "fascisms," though his main comparative category is Italian Fascism, with a capital F.
The standard conception of Fascism as the opposite of Marxism-Leninism, Gregor shows, derives directly from the earliest critiques of Mussolini's Fascist movement by Mussolini's former comrades, Italian Marxists, along with Austrian, French, and German Marxists. By the mid-twenties, immediately after the Fascist seizure of power in Italy, Clara Zetkin and other Comintern members worked out the coarse outlines of a Marxist line: Fascism was simply the front for capitalists who were struggling against the working class to bolster "the terroristic dictatorship of big capital."
The Marxist critique became more sophisticated over the next decades, but the vision of Fascism as an essentially inhumane opposite to Marxism remained a staple. In the thirties, R. Palme Dutt summarized many of these elaborations in the form of a standard narrative: Marx showed that the capitalist system must reach a crisis of profitability in which the rate of profit sinks toward zero; the interests of heavy industry and high finance would no longer be able to develop the forces of production; capitalism would have performed its historic role, and the capitalists would have to resort to sheer terror to maintain their power; Fascism represented this sheer terror. According to Dutt, generic fascism was "the most complete expression of the whole tendency of modern capitalism in decay" (p. 34).
Gregor points out that even in the thirties, some Marxist intellectuals were already rejecting the mainstream Comintern theories as unworkable. Both Otto Bauer and Franz Borkenau conceived of the fascist movements as anything but simple fronts for the capitalists. Borkenau, in particular, viewed fascism as a movement whose role was that of a "mass-mobilizing developmental dictatorship under single-party auspices," a transitional form of nationalist authoritarianism which accelerated economic development to bring the economy into line with national power - essentially a "Bonapartist" process. Both Bauer and Borkenau were thinking of Stalin and his nationalization of the revolution in Russia.
After the Second World War, Marxist theories about fascism turned on the death of Stalin in 1953, his denunciation by Khrushchev in 1956, and the subsequent enmity between Russia and China. The mutual name-calling which the Sino-Soviet hostility brought about in the sixties gave ample opportunity for Soviet and Chinese Communist theorists to brand each other as fascists. Theoretically, the important point here was that Marxists were explicitly asserting that fascism could arise in a system that was not capitalist at all, and, hence, could not be a front for capitalists. Fascism was no longer a historical category but a descriptive term, and a pejorative one, to be used to describe any state monopoly system which exhibited certain features.
Indeed, much of the theory behind the waves of Western academic analysis of fascism in the 1960s and 1970s, Gregor shows, came directly from Chinese and Soviet critiques of each other. Gregor finds much that is, almost ironically, accurate in these Marxist slanging matches, since both sides did, in fact, possess the characteristics of which they accused each other. Loyal Maoists, Gregor writes, could truthfully show how to avoid the snares of the evil revisionists:
To be a true Maoist revolutionary, to thwart fascists, all one had to do was to obey the Chairman in an orgy of submission that many academicians, East and West, insisted was a defining trait of right-wing extremism.(p. 83)
Indeed, one finds in both the Chinese and Soviet systems endless "fascist" characteristics: the Fuehrerprinzip, the command economy, futuristic irrationality, and much more.
Gregor puts many of his arguments together in discussing the rise of fascist-like movements in Russia and other lands of the former Soviet Union in the 1980s and since the fall of Communism. With roots in the sixties, a strong intellectual movement emerged in the 1980s which assisted the nationalist revival. Sergei Kurginian, for example, was a devoted Communist and the author of influential writings which aimed at "national salvation" through a more powerful state. Kurginian approved of Stalin's hierarchic, inflexible, relentless regime, but he thought Stalin had made his "achievements" despite Marxism, not because of it. Gregor labels Kurginian's ideas as "proto-fascist," and shows that Kurginian's influence on Gennadi Ziuganov, one of the most important leaders in the post-Soviet Communist Party, has been substantial and direct. Others, too, have adopted variants of fascist programs in post-Soviet Russia. Almost all started out as particularly committed Marxist-Leninists.
The backbone of Gregor's analysis is his concept of "reactive developmental nationalism," a concept which he seems to adapt in part from several of his subjects, especially from Marxist Franz Borkenau and proto-fascist Roberto Michels. Though Gregor does not treat this concept in a systematic way, his counter to the standard "opposites" theory of Marxism and Fascism seems to stem from it.
In brief, reactive developmental nationalism represents, according to Gregor, a tendency which emerges when a "nation" sees the need to forge ahead economically in order to assert its national identity and place in the sun, and when the progress toward this place in the sun seems stymied by some foreign catastrophe or national embarrassment. The result is a "reactive" authoritarianism, an attempt to develop the nation from the top down and to adopt something like the "reactionary modernism" which Jeffrey Herf has written about in the case of German National Socialism. Gregor sees both Marxism-Leninism and Fascism as the progeny of this process.
Classical liberal or libertarian thought dovetails perfectly with Gregor's demonstration of the similarities of the two systems, but many of the readers of this journal will consider his analytical framework of "reactive nationalism" as unnecessarily complicated. The centralization of power has accompanied the Leviathan state since its earliest development some five or six hundred years ago; World War I and its aftermath simply intensified that longstanding tendency. The particular forms of authoritarianism require historical, but not necessarily "theoretical," explanation.
An extension of this critique of Gregor's book is that his tendency to hold up "democracy" as the true counterpoint to both Marxism-Leninism and Fascism demonstrates the weakness of reactive developmental nationalism as an explanatory category. The twentieth century has shown that democracy has been highly creative and vigorous in developing its own patterns of centralization, Leviathanism, imperialism, collectivism, and intervention into the lives of individuals.
This criticism notwithstanding, The Faces of Janus is an outstanding work of careful scholarship which speaks directly to issues long of interest to students of liberty.
(Review originally published in the Journal of Libertarian Studies, vol. 16 (3), 99-103 and published online as a PDF here)
Friday, May 11, 2007
Christopher Olaf Blum, ed., Critics of the Enlightenment: Readings in the French Counter-Revolutionary Tradition (Wilmington: ISI Books, 2004), 357 pp. $30
Michael Burleigh, Earthly Powers: Religion and Politics in Europe from the French Revolution to the Great War (New York: HarperCollins, 2006), 530 pp. $29.95
Theodore K. Rabb, The Last Days of the Renaissance & the March to Modernity (New York: Basic Books, 2006), 246 pp. $26.95
John Robertson, The Case for the Enlightenment: Scotland and Naples 1680–1760 (Cambridge University Press, 2005), 455 pp. $95
Rodney Stark, The Victory of Reason: How Christianity Led to Freedom, Capitalism, and Western Success (New York: Random House, 2005), 283 pp. $25.95
If anything can be given credit for holding the West together during the turbulent twentieth century, the struggle against communism probably merits the greatest recognition. Certainly the fall of the Soviet empire engendered an unexpected amount of internal turmoil and self-analysis in the West. With the main external threat reduced to tatters, a major "culture war" broke out in the West as both sides fought over what the new definition of Western civilization should be, freed by the knowledge that whatever happened, it would be our decision and not imposed on us from outside. After a brief pause following the 9/11 attacks and the Afghanistan war, the West's internal conflict resumed, with, if anything, even greater acrimony.
But in fact this conflict has simply been the latest clash between two powerful streams of thought that have been present throughout European history. One stream comes from Greece and Rome, and the other from Calvary. The Western world, sometimes known as Christendom, has always vacillated between the two, with one stream sometimes sweeping history in its direction, and sometimes the other being predominant. Western history is in fact the record of the intellectual turbulence created and sustained by these two currents. To understand where we are today, we have to understand the nature of these two streams and where each would lead us.
Thomas Sowell offered key insights into these two currents of thought in A Conflict of Visions (William Morrow, 1987). Sowell argued that our policy disputes and culture wars are based on two opposing views of the human condition: a “constrained” one, which sees human nature as having inherent limits, and an “unconstrained” one, which sees human beings as perfectible. The former vision, he wrote, sees humans’ immutable limits as suggesting the value of market economies and limited government, whereas the opposite view sees humanity as requiring and able to accommodate a transformation to make us fit a rational social system, devised by the wisest among us, that will solve all our problems. These two worldviews are basically a continuum, Sowell noted, with nearly all individuals falling somewhere between the two poles. Nonetheless, they do represent two underlying views that people actually hold.
Sowell's book delved into intellectual history and specifically the Enlightenment. He identified Adam Smith and the American Founders as representing the constrained vision, and Jean-Jacques Rousseau as exemplifying the unconstrained view: “When Rousseau said that man ‘is born free’ but ‘is everywhere in chains,’ he expressed the essence of the unconstrained vision, in which the fundamental problem is not nature or man but institutions.” Other authors have offered similar schemes and analyses based on “worldview” theses in recent years—such as Francis Fukuyama's The End of History and the Last Man and Samuel Huntington's The Clash of Civilizations. The common theme of this discussion has been the premise that “As a man thinketh in his heart, so is he” (Proverbs 23:7).
The search for the Rosetta Stone of modern ideologies in the West has brought historians and political analysts repeatedly to the Enlightenment. There are two obvious reasons for this. The first is that modernity is the fruit of the Enlightenment, and that studying the latter will help us understand the former. The other is the premise that the Enlightenment was a very good thing in that it released humankind from superstition, backwardness, and irrationality, ushering in an Age of Reason in which the goal would be to transform society and the individual so that both operated along purely rational lines.
Both of these premises are eminently questionable, however. As Gertrude Himmelfarb pointed out in The Roads to Modernity: The British, French, and American Enlightenments (Knopf, 2004), the Enlightenment was anything but a monolithic movement toward some specific destination we have since learned to call modernity. For every Rousseau, Himmelfarb notes, there was an Edmund Burke—and often more. In addition, although numerous schemes to transform individuals and society have been offered and tried, modernity has by no means been exclusively devoted to such efforts—those pesky “constrained” thinkers have continuously been around to point out the flaws in every such plan, and the hoi polloi have perpetually resisted efforts to transform them and their world.1
As useful as Sowell's schema may be, it remains incomplete until we understand where these two worldviews come from and the basic values and assumptions that underlie them. For only then can we fully understand their implications. If we can find in the Enlightenment's turmoil the deeper currents that push us toward either of these two worldviews, we may be able to derive a greater understanding of the conflicts and problems of our own time.
In The Case for the Enlightenment, Oxford historian John Robertson seeks to “make the case for the Enlightenment as a coherent, Europe-wide intellectual movement” based on “the commitment to understanding, and hence to advancing, the causes and conditions of human betterment in this world.” He argues that the central quest of the Enlightenment was to know more about the world so as to gain greater control over nature and society. To demonstrate this, Robertson undertakes a detailed comparison of how this endeavor was manifested in Scotland and Naples, two very different milieus at opposite ends of Europe. Noting that the two kingdoms ultimately converged on a common commitment—to use “political economy as the primary intellectual discourse with which to address a wider ‘public’ among their fellow countrymen, because it held the key to understanding the conditions of betterment in this world”—Robertson argues that “out of two very different ‘national’ contexts came one Enlightenment.”
Robertson maintains that the idea that most animated Enlightenment thinking was the Greek philosophy of Epicureanism. Some advocated it, and some opposed it, but everybody had to contend with it. Robertson notes that in 1697 in his Dictionnaire historique et critique, the French philosopher Pierre Bayle “underlined the superior honesty of the Epicurean account of human nature and its compatibility with the condition of man after the Fall.” Robertson sees this as part of an interesting phenomenon in which Epicureanism was reconciled with the reigning Augustinian view of the human condition and moral thought.
Hence, Robertson concludes, the quest for improvement, despite its Greek roots, was not necessarily hostile toward the Christian religion: “a focus on betterment in this world carried no necessary implication about the existence of the next.” Yet in making the case for Epicureanism, Bayle contributed an important element to the Enlightenment: the idea that a person or society could be moral without belief in God, and that we can understand human history without seeing any role of providence in it. This seems perfectly obvious, but the importance of Bayle's argument cannot be overestimated, because it immediately led to much bigger changes in thinking which animate our current debates over everything from sexual morality to what to teach about human origins. And the fact that these debates have yet to be resolved suggests that, first, the Enlightenment mind was much more conflicted than Robertson believes, and second, that Epicureanism is less compatible with Christianity than he is willing to admit.
David Hume, for example, soon took Bayle's argument much further: “What Hume had done was critically weaken what Bayle always treated as the one secure bulwark against the Epicurean, atheist account of nature and man: the truth of Scripture as God's revelation to Man.” Whereas Giambattista Vico had answered Bayle by developing “an account of human sociability in which men's actions manifested the guiding hand of divine providence,” Hume posited “an account of morality and society as purely human creations, the outcome of a remarkable combination of human nature and artifice.” As Robertson notes, Hume believed that “our moral sentiments, natural and artificial, really accord with what we find useful and agreeable in this world. In our morals, we are—and we are the better for being—sociable atheists.” In other words, Hume rejects the notion of original sin and the idea that human governments are established to save us from ourselves and one another. Although he avoided making explicit public expression of his unbelief, Hume not only rejected Christianity as a fact, he sought to make it intellectually unnecessary to the establishment of human happiness.
This is a great difference in visions, as exemplified by the gulf between Hume and Vico, and if both visions led to an increased interest in political economy, the differences nonetheless remain overwhelmingly evident. Robertson notes, for example, that Adam Smith disagreed with Rousseau's account of the development of human sociability, and that he repeatedly invoked in his works “the idea of a natural moral order presided over by a provident deity, whose interventions are like an ‘invisible hand.’” That is a good observation on Robertson's part, but one would like to see him draw the obvious conclusion: that Smith and Rousseau were on two entirely different tracks, and that the intellectual currents leading to and from them may likewise have been very different from each other.
Was any of this philosophizing—and in particular this conflict of visions—really unique to the Enlightenment period? In The Last Days of the Renaissance, Princeton University historian Theodore Rabb seeks “to identify a succession of fundamental shifts in historical periods from the Middle Ages to the present, with special attention to the time when the Renaissance dissolved into the Age of Revolution” (his term for the Enlightenment). Rabb makes a strong case that medieval Europe was far more unified than is commonly thought and that it was more like the Renaissance than is usually appreciated. Regarding political organization, for example, Rabb writes, “Although the pyramidal hierarchic model was always taken for granted—that is, one or a few at the top of the social order, enjoying a God-given right to rule, and then increasing numbers at each level of descent down to the mass of the people at the bottom,” it is important to remember that “localities throughout Europe also observed, during the Middle Ages, crucial rights of representation and consultation that gave Europe its unique political and legal character.”
In distinct contrast to today's secular, post-Enlightenment worldview, Rabb points out that the foremost assumption of medieval Europe was “the determination to bring religious beliefs to bear on every aspect of existence. The supernatural gave shape and meaning to all human affairs. For scholars and theologians, the task was to explain how that influence operated and how it was to be understood.” Europe was Christendom, plain and simple. The stream of thought from Greece and Rome was almost entirely dry, lost in obscure, dusty libraries or preserved, ironically, in Christian monasteries and the thinking of Christian theologians such as Thomas Aquinas.
This began to change in the fourteenth—not the eighteenth—century. The “assault on the values of the Middle Ages” followed challenges to the papacy's worldly authority by secular rulers. Rabb contends, “Ideas followed action as theorists drew sharper lines between the authority of the papacy and the power of the princes.” Marsiglio of Padua, for example, pointed out in 1324 that Jesus Christ had consented to the authority of the Roman emperor in going willingly to his crucifixion.
Rabb observes that the “widespread and effective aggressiveness of secular rulers, particularly toward the Church, was to be a dominant feature of Renaissance Europe.” By the mid-1400s, “it was unmistakable that the papacy had lost its ability to challenge the monarchs it had once cowed. A new era had been born.” The increasing unwillingness of secular rulers to bow to the papacy was accompanied by “growing dissatisfaction with religious doctrine and practice.” Rabb points out the powerful challenges to the Church's hierarchy that arose from John Wycliffe in England beginning in the 1370s and from Jan Hus in Bohemia shortly thereafter. Both were anathematized, and Hus was executed in 1415.
What is particularly interesting about these Renaissance-era religious reformers—and I use that word intentionally—is how strongly their dissents resembled those of the Reformers of the sixteenth century. Rabb strenuously advocates including the Reformation era as simply part of the Renaissance, but this appears to obscure the key distinction between the two main streams of thought in post-medieval Europe: Christianity and humanism. A central element of the Renaissance was the return to ideas and values of classical antiquity, of Greece and Rome. But where the Church embraced the Renaissance to some degree, Wycliffe and Hus can hardly be seen as part of that trend. What Wycliffe and Hus were calling for was a return to the ideas, policies, and rituals of the early Christian church. Their road led not to Rome or Athens but instead to Asia Minor. Wycliffe raised “doubts about ecclesiastical finances, formal ceremonies, and clerical behavior” and demanded “greater reliance on the Bible,” as Rabb notes, and Hus called for “a religion centered on biblical precept, individual faith, and a more egalitarian ritual.”
Those are exactly the things the Protestant Reformers sought a century later. There was a continuous line of intellectual activity, moreover, leading from Wycliffe and Hus to Luther and Calvin. The former can be seen as early Reformers, and the Renaissance as a greater bifurcation of the European mind: stronger secularism and stronger religion at the same time. Some took the road to Rome and Athens, and some took the road to Calvary. This process accelerated during the Enlightenment, and it is what has made modernity what it is.
It makes a good deal of sense, then, to include the Reformation as part of the Renaissance, but only if we make sure to appreciate the great flowering of the Christian religion in Europe during this period. The Renaissance can be seen as an increasing appreciation for the ideas of antiquity, but can be fully understood as such only if we include the return to early Christian religious values as well as the philosophical legacy of Greece and Rome. Such an appreciation also helps us make sense of the perpetual tug of war between the two mentalities in the West throughout the ensuing centuries.
Rabb emphasizes the similarities between what he calls the Age of Revolution and our present situation:
In the same way as the rationalism of the Enlightenment prompted the Romantics to call for a return to the emotions, so the growing skepticism, internationalism, and materialism of the decades after World War II have been met by a resurgence of moral and religious passion and demands for a reassertion of ‘traditional’ values and local interests. The contest between these forces, often reduced to such simple dichotomies as The Market vs. The Welfare State or Islam vs. The West or even The Culture Wars, may cause ever-widening cleavages and even fiercer battles. If the past is any guide, however, it will eventually become clear how the world wishes to move forward, and the coherence of the age will take shape anew.
What Rabb is describing here is the continued conflict between the two great streams of European thought. Indeed, much of Rabb's book is devoted to analysis of the Renaissance mind and the transformation into what he calls the Age of Revolution, as the book's title suggests, and the phenomena he describes continually demonstrate that the same two streams of thought animated both that time and the subsequent eras. The sixteenth-century artistic movement of Mannerism, for example, with its “unsettling distortions,” a “focus on agonized figures and swirling, unstable compositions,” the use of “figures given inexplicable gestures and odd postures,” and predilection for “disturbing subjects,” strikes Rabb as “symptomatic of a world in which doubts were growing and answers remained elusive.” These also sound like the same things that drove modern art in the twentieth century.
Similarly, the roots of today's relativism and multiculturalism are distinctly evident in Rabb's description of Renaissance-era skepticism: “With antagonists asserting exclusive and incompatible versions of truth, it seemed only appropriate to question the very nature of such claims.” Even the details of the arguments are familiar. Rabb quotes Michel de Montaigne after the latter's encounter with a cannibal who had been brought to France, in which the French writer calls his own society barbarous while entirely absolving the cannibal of any moral responsibility: “I find that there is nothing barbarous or savage in this nation, except that we call barbarism whatever we ourselves do not do. Indeed, we seem to have no definition of truth and reason other than the opinions and customs of the place where we live.”
Likewise, Rabb quotes the Spanish friar Bartolomé de las Casas as scathingly describing the incursion of Europeans into the New World:
God made all the peoples of [the New World], many and varied as they are, as open and innocent as can be imagined. It was upon these gentle lambs that from the very first day they clapped eyes on them the Spanish fell like ravening wolves upon the fold, or like tigers and savage lions who have not eaten meat for days. The pattern established at the outset has remained unchanged to this day, and the Spaniards still do nothing save tear the natives to shreds, murder them and inflict upon them untold misery, suffering and distress, tormenting, harrying and persecuting mercilessly.
As Rabb points out, “These voices were exceptional, but they were widely heard.” And they are widely heard today as well. In fact, the words of de las Casas are reminiscent of many such critiques of Western imperialism in recent decades.
Political liberty also has its roots in the Renaissance, Rabb observes. “Especially noteworthy was the creation of a quite distinct republican political tradition. Its roots grew in a handful of prosperous and relatively independent cities in the late Middle Ages, and it drew inspiration from the memory of Republican Rome, but it began to flower only in the fourteenth and subsequent centuries.” Venice, Switzerland, the Netherlands, and Oliver Cromwell's Britain “all emphasized the ‘liberties’ of their citizens in contrast to the ‘subject’ status of those who were ruled by hereditary princes.” (It is worth noting that the places where this political philosophy hung on most tenaciously were Protestant.) The call for political liberty rose in response to the centralizing tendencies of early-Renaissance monarchs, and it clearly presaged current-day debates over subsidiarity and federalism.
The scientific method arose during the Renaissance, not the Enlightenment, and it has been a permanent part of the scene since. The similarities between current thoughts and all the other trends of the Renaissance that Rabb describes are equally clear. All of these phenomena point us toward an inescapable conclusion that Rabb does not himself bring out: what appears to have happened during the Renaissance was not a distinct, coherent line of thinking but instead an open divergence of two streams of thought that quickly began a great struggle for dominance and initiated a mutual pattern of ebb and flow that persists to this day.
In Earthly Powers, British historian Michael Burleigh extends this story from the Enlightenment to the First World War. Burleigh explores the idea of political religions—the way in which modern-era politics have consistently taken on a cast of religious fervor among those who have no religion—and how moderns base non- and even anti-Christian dogmas on Christian ideas and customs. Burleigh's book counteracts the temptation to underestimate the extent to which Christianity permeates our culture and has done so even when that influence is least evident. His book thus questions the tendency to think that the past couple of centuries have brought a thorough or irreversible secularization of the West.
Burleigh notes that after 1789, Edmund Burke's “key insight was to realize that ‘a theory concerning government may become as much a cause of fanaticism as a dogma in religion.’”2 In fact, it was in making secular progress into a religion that the French Revolution was perhaps most revolutionary. Burleigh points out that “it had its creeds, liturgies and sacred texts, its own vocabulary of virtues and vices, and, last but not least, the ambition of regenerating mankind itself, even if it denied divine intervention or the afterlife. The result was a series of deified abstractions worshipped through the denatured language and liturgy of Christianity.”
Burleigh documents the myriad ways in which “the discourse of the Revolution was saturated with religious terminology: words like catechism, credo, fanatical, gospel, martyr, missionary, propaganda, sacrament, sermon, zealot, were transferred from a religious to a political context.” The attempted adoption of a new calendar and the deliberate suppression of the Catholic Church and its clergy were both part of the same process: the removal of all competing institutions and even habits of thought and behavior that might impede the creation of a new type of human being, one fit for a society incarnating the Jacobins’ “abstract vision of community, harmony and national unity.”
Although Burleigh cautions us not to forget the distinctive nature of the Jacobins, he makes the obvious connection to modern efforts to remake the world through political religions, such as Soviet and Chinese communism and German national socialism. For example, mass murders of the political opposition, or those merely caught in the crossfire, were all too common during the Terror—and the methods employed were as efficient and coldhearted as those of modern regimes. When the guillotine was not fast enough in Lyons in early 1794, “the government's soldiers used cannonfire to gun down large batches of prisoners, with swordsmen finishing off those left half dead by rounds of grapeshot.” That same year, the Jacobins used mass drownings in Nantes to kill off “enemies of the Revolution,” claiming some 1,800 victims in this gruesome way. Burleigh notes that through these and other atrocities “up to a third of the population perished, a statistic roughly equivalent to the horrors of twentieth-century Cambodia.”
As the Terror was giving way to Napoleon and his willingness to make peace with the Church as long as he could keep it under his thumb, the Romantic era brought a somewhat greater appreciation of Christianity throughout the Continent. It was “a great age of Christian faith,” as Burleigh notes, but “it was also an age of publicly aired religious doubt.” Both streams of European history were in full swell, and the result was great turbulence. In France, a strong counter-Revolutionary line of thinking arose, much of it aggressively Catholic in nature. Appreciation for the Middle Ages increased, and writers such as Joseph de Maistre and Louis de Bonald openly praised the lost medieval order.
In Critics of the Enlightenment, editor Christopher Olaf Blum presents representative works from six such thinkers. The book provides powerful evidence of the vitality of Christian thought in the nineteenth century, as exemplified by the publication of Chateaubriand's The Genius of Christianity in 1802 in France, where praising Europe's Christian past had been unthinkable just a few years earlier. Chateaubriand's book kicked off a greater respect for the Catholic Church and the Middle Ages among some French intellectuals. The writers were aggressively and unabashedly Catholic partisans. Joseph de Maistre, writing in 1798, claimed, “Europe has one great enemy. . . . It is a fatal ulcer that attaches itself to all sovereign powers and eats away at them. It is the son of pride, the father of anarchy, the universal solvent: Protestantism.”
Louis de Bonald echoed those sentiments in 1815, but later writers such as Frédéric Le Play got past that and were willing to offer specific reforms to reverse some of the more damaging policies that had been implemented in the name of social progress. Some of these arguments sound very familiar to the contemporary reader. In his 1864 book Social Reform in France, for example, Le Play noted, “As with all social institutions, the family is today the subject of lively controversy, and the errors that have been published about it greatly trouble our minds.”
The struggles between church and state during the nineteenth century brought about a painful culture war, Burleigh notes. Efforts to rethink society from scratch were legion, from Auguste Comte's Positivism to the utopian socialist experiments of Robert Owen in Britain and America to the communism of Marx and Engels. What they all had in common was a denial of the Christian doctrine of original sin and a consequent belief in human perfectibility—and the requirement that humans be perfected in order to live in the transformed society. Many churches responded with a greater emphasis on social improvement, to the point that Christian Socialist movements began to form as early as 1850 in England.
As churches slipped away from their pastoral functions and took up social causes, religious observance waned. In Berlin in 1869, only 1 percent of the people in working-class parishes attended church on Sundays; by the end of the century, the German middle classes had “largely distanced themselves from the Churches, viewing them coolly as survivals from a world that had passed.” They were materialistic and under the sway of a “vulgar scientism,” Burleigh notes. The only real religion in Germany was nationalism, and the anti-Christian anti-Semitism of Paul Lagarde and his openly pagan, Teutonic-roots followers rushed in to fill the nation's spiritual void, with horrific results, once “given a massive impetus by the cataclysm of the First World War and the turmoil that followed it.”
Burleigh's book stops there, with the observation that the “Great War, the domestic and international civil wars, and economic dislocation that followed it, gave rise to mass despair, to which the solution appeared to be various forms of authoritarianism.” With the hollowing out of Christian faith in Europe, the atheist, humanist, secular tide had finally risen to its crest. It remains high there, and Christianity has gone nearly dry in Europe, even as it grows exponentially in Africa, Asia, and South America and remains about as high as usual in the United States.
In The Victory of Reason, Baylor University professor Rodney Stark makes a bold claim about where these two streams lead: “The success of the West, including the rise of science, rested entirely on religious foundations, and the people who brought it about were devout Christians.” And not just Protestants, Stark emphasizes, but the previous fifteen hundred years of Christianity as well. Christianity's devotion to theology creates a “faith in reason” which Stark says is precisely what has made Christendom great: “From very early days, Christian theologians have assumed that the application of reason can yield an increasingly accurate understanding of God's will” (emphasis in original).
Rejecting the claim that the source of modern science was Greece and Rome by way of the Renaissance and Enlightenment, Stark writes,
The rise of science was not an extension of classical learning. It was the natural outgrowth of Christian doctrine. . . . Because God is perfect, his handiwork functions in accord with immutable principles. By the full use of our God-given powers of reason and observation, it ought to be possible to discover these principles. These were the crucial ideas that explain why science arose in Christian Europe and nowhere else.
Stark states, “The so-called Scientific Revolution of the sixteenth century has been misinterpreted by those wishing to assert an inherent conflict between religion and science.” On the contrary, writes Stark, “these achievements were the culmination of many centuries of systematic progress by medieval Scholastics, sustained by that uniquely Christian twelfth-century invention, the university.” Real science, Stark notes, “arose only once: in Europe.” Other civilizations had alchemy, but “only in Europe did alchemy develop into chemistry.”
Stark surveys European history from the fall of Rome to the present and demonstrates that Christians and Christian ideas were behind all the great achievements of the West. Pointing out that Christianity's sense of human individuality, moral equality, and belief in free will were far more conducive to human creativity than Greek and Roman assumptions were, Stark thoroughly refutes the Enlightenment-era fiction that between the fall of Rome and the medieval era, Europe languished in Dark Ages (an argument Rabb likewise rejects). On the contrary, Stark writes, “Rapid intellectual and material progress began as soon as Europeans escaped from the stultifying grip of Roman repression and mistaken Greek idealism.”
Calling the Dark Ages “an era of extraordinary invention and innovation,” Stark points out the astonishing breadth and variety of progress in Europe after the fall of Rome. He documents the exponential increase in power generation from water, wind, and horses; the growth of productivity in agriculture with the innovation of the three-field system and the development of the heavy, wheeled plow; the rise of fish farming; and innovations in cloth-making. He identifies powerful innovations in land and sea transportation, in the arts and literature, and in science and education that occurred long before the Renaissance. Finally, Stark shows that the rise of capitalism began early in the ninth century—and was invented by Catholic monks.
All of these things, especially the rise of capitalism, set the stage for political progress. Stark writes, “The success of the West depended on the development of free societies able to provide secure havens for early capitalism. Here too, Christianity played the key role, providing a moral basis for democracy far beyond anything envisioned by classical philosophers.”
The current implications of all this are quite evident. Stark flatly states, “Christianity created Western Civilization. . . . The modern world arose only in Christian societies. Not in Islam. Not in Asia. Not in a ‘secular’ society—there having been none.” If there had been only the Greek and Roman stream in European thought, things would be quite different. Stark writes:
Without a theology committed to reason, progress, and moral equality, today the entire world would be about where non-European societies were in, say, 1800: A world with many astrologers and alchemists but no scientists. A world of despots, lacking universities, banks, factories, eyeglasses, chimneys, and pianos. . . a world truly living in “dark ages.”
Of course, other societies have benefited from this progress, but “all the modernization that has. . . occurred outside Christendom was imported from the West, often brought by colonizers and missionaries.” Hence, Stark argues, “It seems doubtful that an effective modern economy can be created without adopting capitalism,” and “Without secure property rights and substantial individual freedom, modern societies cannot fully emerge.” Stark cites Russia, China, and the Islamic nations as places where these conditions do not currently exist. However, he notes, Christianity is spreading rapidly around the globe—in Latin America, Africa, and China, tens of millions of people have become Christian in recent years. There may be 100 million Christians in China today, and about half of all sub-Saharan Africans are Christian.
Religious historian Philip Jenkins makes this point powerfully in The Next Christendom: The Coming of Global Christianity (2002) and The New Faces of Christianity: Believing the Bible in the Global South (2006). Jenkins shows how rapidly Christianity is growing in most of the world while it recedes in Europe and Canada, emphasizing the differences between “northern” and “southern” forms of Christianity, with the latter being more like the early church. Exploring the implications of this demographic shift, Jenkins foresees possible future conflicts within the church and between surging southern Christianity and Islam.
Taking this argument further, the Catholic writer George Weigel argues that the secularization of the West has brought on a civilizational downward spiral that is particularly dire in Europe. Weigel observes, “The present EU project grew out of the work of three great Catholic statesmen: France's Robert Schuman, Italy's Alcide de Gasperi, and Germany's Konrad Adenauer. That distinguished triumvirate understood their work on Europe's economic and political integration as an expression of their commitment to revitalizing the Christian civilization of Europe after the catastrophe that was the first half of the twentieth century. It is very hard, however, to find traces of that Christian vision of Europe in today's EU.” Weigel observes a powerful antireligious bias in incidents such as the EU's 2006 decisions affirming same-sex marriage and not permitting doctors to refuse to perform abortions. He argues that such policies “confirm the suspicion that the EU bureaucracy is determined to impose lifestyle libertinism on all of Europe in the name of fundamental human rights.”3
Jenkins observes that the United States is culturally much closer to Africa than Europe is: “America is somewhere in between,” he said in an October 2005 interview. “In terms of its values, it may have as much to do with Africa as it does with Europe, which I think is a difference that explains a lot of the political divides between America and Europe. Americans take religious arguments more seriously.”4 This difference bodes ill for Europe: as Faith and Reason Institute president Robert Royal argues, the health and survival of free institutions in modern democratic societies depend on a Christian view of the dignity of the human person.5
Stark points out that many around the world find Christianity attractive precisely because it appeals to reason and “is so inseparably linked to the rise of Western Civilization.” To them, to be modern and rational means to be a Christian. But there is another side to this, which Stark mentions all too briefly. If his argument is correct, it strongly implies that any effort to foster democracy and market capitalism in, say, Islamic nations is doomed to fail. In addition, it suggests that Europe is in for some truly dark ages unless Christianity is necessary only to the establishment of economic and political freedom and these, once they have taken hold, can be sustained perpetually without it—which seems unlikely over the long term, especially given the EU's increasing inroads against economic freedom, political pluralism, and Christian values.
If this is true, then the hostility of Islam toward the West is unlikely to recede any time soon, and the split between Europe and the United States will probably worsen in the years to come, unless Europe should undergo an unlikely religious revival. The good news is that numerous allies should arise to the south and across the Pacific over time, though they will not have much economic and strategic power for a while. All of which suggests that for the foreseeable future, the United States will continue to bear high responsibilities in the world as the sole superpower with few strong allies.
1. An excellent analysis of how tenaciously people hold on to their core beliefs is Paul Hollander's The End of Commitment: Intellectuals, Revolutionaries, and Political Morality (Chicago: Ivan R. Dee, 2006).
2. Hollander's The End of Commitment provides a thorough consideration of this phenomenon as it was manifested in the twentieth century.
3. George Weigel, “Europe's Two Culture Wars,” The Catholic Difference (syndicated column), Mar. 29, 2006.
4. “Philip Jenkins: The Wittenburg Door Interview,” Wittenburg Door Online Extra, October 2005, at www.wittenburgdoor.com.
5. Robert Royal, The God That Did Not Fail: How Religion Built and Sustains the West (New York: Encounter Books, 2006).
Samuel T. Karnick (firstname.lastname@example.org) is an associate fellow of the Sagamore Institute for Policy Research and director of publications and senior editor at the Heartland Institute.
From: Orbis: Volume 51, Issue 1, Winter 2007, Pages 174-187