Monday, January 16, 2012

This one is from Professor Edward Feser's blog -- his latest posting. I cite Feser a number of times in my book. I've learned a great deal from him (his books and blog site I both enthusiastically recommend). He has a first-rate mind and is a thoroughgoing Thomist. There is much in this piece that is very close to my own treatment of the subject. I know that when I find I am in agreement with Prof. Feser, my thinking is probably squarely in the great Catholic intellectual tradition. Prof. Feser argues here that that rich tradition, which has no "sell by" date, is what can speak to men of any age. He thereby implies what I argue in my book: that it is unnecessary (indeed, I argue that it is a profound mistake) to attempt to persuade modern man with modern thought.


Point of contact

Bruce Charlton identifies six problems for modern Christian apologists, and proposes a solution.  His remarks are all interesting, but I want to focus on the first and most fundamental of the problems he identifies, which is that the metaphysical and moral knowledge that even pagans had in the ancient world can no longer be taken for granted:

Christianity is a much bigger jump from secular modernity than from paganism.  Christianity seemed like a completion of paganism - a step or two further in the same direction and building on what was already there: souls and their survival beyond death, the intrinsic nature of sin, the activities of invisible powers and so on.  With moderns there is nothing to build on (except perhaps childhood memories or alternative realities glimpsed through art and literature).

From this problem follow several others.  Bruce continues:

Modern Christianity as experienced by converts tends to be incomplete - precisely because modern Christianity has nothing to build on.  This means that modern incomplete Christianity lacks explanatory power, seems to have little or nothing to say about what seem to be the main problems of living.  For example, modern Christianity seems to have nothing to do with politics, law, art, philosophy or science; to inhabit a tiny, shrinking realm cut-off from daily concerns.

and

Modern Christianity often feels shallow - it seems to rely on diktat of scripture and the Church - this is because moderns lack a basis in the spontaneous perceptions of Natural Law, animism, the sense of active supernatural power in everyday life.  Modern Christianity (after the first flush of the conversion experience) thus feels dry, abstract, legalistic, prohibitive, uninvolving, lacking in purpose. 

As they say, read the whole thing.  There is, I think, much truth in what Bruce has to say.  To be sure, I don’t for a moment think (and I take it that Bruce doesn’t think) that Christianity really is in fact “shallow,” “incomplete,” “dry,” “lacking in purpose,” devoid of “explanatory power,” with “nothing to build on” by way of common ground with secular modernity, etc.  Quite the opposite.  But I agree that it can seem that way to many modern people.  (It more or less seemed that way to me in my atheist days, before I discovered what Christianity, and in particular Catholicism, actually said -- that is to say, what its greatest representatives have actually held historically, as contrasted with the distortions of Christianity, whether liberal or fundamentalist, that have replaced it in much of the public mind.)

The problem, in part, is one of historical and cultural circumstances.  Take a simple example, the Christian description of Jesus Christ as Lord and Savior.  To modern people this sort of talk can sound unbearably mawkish; indeed, I sometimes find it unbearably mawkish, unless the context is such as to counteract the awful cultural associations that have come to surround it.  Hence, if I’m hearing a reference to Jesus as Lord or Savior in the context of the Mass (whether the extraordinary form or the ordinary form celebrated in a dignified way), it does not bother me at all; but if I hear it uttered by a televangelist, I feel (perhaps like a Dawkins or a Hitchens would) an irresistible urge to change the channel.  

Think, though, of the associations a word like “Lord” would have to someone in the ancient or medieval world -- it would bring to mind an emperor, or an aristocrat.  Think of what “Savior” would mean in a cultural context where ancient local communities were being swallowed up by ruthless and seemingly invincible empires, and where rigorist moral systems like Stoicism and Neo-Platonism competed for the allegiance of the intelligentsia -- that is to say, where people had an ongoing sense both of being in real physical jeopardy and of continual personal moral failure.   A description of Jesus of Nazareth as “Lord” and “Savior” would have the reverse of the sentimental and effeminate connotations secularists hear in it now -- it might bring to mind a stern Constantine riding to the rescue on horseback, say, rather than a Mister Rogers with long hair and sandals, ready with a smile and a Band Aid for your spiritual boo-boos.

Combine the egalitarian politics, easy morals, and relative affluence and social stability of recent decades, and few people in the modern secular world are looking for a Lord or Savior in a sense the ancients and medievals would have understood.  Add to that the fact that “Jesus is Lord!” has become the expression of a therapeutic, emotionalistic religiosity conveyed through mass-produced T-shirts, bumper stickers, and bad music, and the whole idea is bound to the modern secularist to seem unintelligible and repulsively tacky.  (Scratch a New Atheist and you’ll often find that this is the kind of stuff he’s reacting against, and all he’s ever known of Christianity.)

So that’s part of the problem.  But that can be remedied if proponents of a muscular and intellectually rigorous form of Christianity -- that is to say, of Christianity simpliciter, as it existed historically -- rediscover their ancient heritage.  For they will thereby rediscover too the heritage of the pagan world, and find in it the resources to communicate with modern man, indeed with any man.  Aristotelians and Neo-Platonists knew that God exists, they knew that man is not a purely material creature, and they knew that good and bad are objective features of the world and that reason directs us to pursue the good.  They knew these things through philosophical arguments which have lost none of their force, arguments which were picked up and refined by Christian thinkers and which informed the great Scholastic tradition.  

As Pope Leo XIII beautifully put it in Aeterni Patris, the intellectual treasures of the pagans are like the gold and silver vessels the Israelites took out of Egypt, ready for deployment in the service of the true religion.  Thus did the Scholasticism whose revival this encyclical fostered happily adopt whatever was of value in the thought of Greeks and Romans, Jews and Arabs.  With philosophy as with art, literature, and architecture, if you want to learn what the greatest non-Christians had to offer, come to the Church, which absorbs and protects it -- honoring our divinely given nature and its products even as she raises them higher through grace.  She reminds man of what he already knows, or can know, through his own powers, before revealing to him truths he could not arrive at under his own steam.  She speaks to him in his own language -- the language of natural theology and natural law, which are in principle accessible to all, and have no “sell by” date.  Even modern secularists know this language, for they are no less human than their pagan ancestors.  The problem is that they speak it at only a grade school or even kindergarten level, whereas the greatest of the ancients at least had high school level proficiency.  But through “remedial education” they, like the ancient pagans, can be prepared for the graduate level work afforded by divine revelation.

This is, of course, the idea of what Aquinas called the praeambula fidei -- the preambles of faith, by which philosophy opens the door for revelation (where faith and revelation, keep in mind, are when rightly understood in no way contrary to reason but build on it -- I have explained how in the first half of a previous post).  But this brings us to another problem.  Like the Pharisee who scorns the sincere piety and virtue of the Samaritan, some Christians scorn natural theology and natural law as impious or at least questionable.  They either despise human nature, and with it any non-Christian understanding of God and morality, as altogether corrupt and without value; or they are willing at least verbally to affirm that nature, but only if it is effectively absorbed into the order of grace, like the Monophysite who is willing to acknowledge Christ’s human nature only if it is first completely divinized.  On the former tendency, faith alone and scripture alone must suffice to bring one to Christianity, preambles be damned.  On the latter, human nature is conceived of in a way which (to borrow a phrase from Pope Pius XII) threatens to “destroy the gratuity of the supernatural order” by taking the natural up into the supernatural, in effect treating natural theology and natural law as if only the Christian can understand them aright.  In both cases Christianity can come to seem a matter of mere diktat (as Bruce Charlton puts it) -- fideistic, inaccessible from and irrelevant to the world of the non-believer.

The first tendency, obviously, is associated with Luther and Calvin, though it is only fair to acknowledge that there are Protestants who have resisted it.  All the same, their resistance is itself often resisted by their coreligionists, as is illustrated by a famous dispute between the 20th century Protestant theologians Emil Brunner and Karl Barth.  Brunner argued that natural theology represents a “point of contact” between human nature and divine revelation, by which the former might be able to receive the latter (though even Brunner qualified his notion of “natural theology,” lest it imply the certainty of God’s existence through natural reason alone that is affirmed by Catholicism).  Barth responded angrily (in a work with the pithy title “No!”), rejecting any suggestion that human nature contributes something to the “encounter” between God and man and arguing that any needed “point of contact” was itself provided by revelation rather than human nature.  This is a little like saying that billiard ball A knocks into billiard ball B by hitting, not B’s surface, but a surface provided by A.  If intelligible at all, it only pushes the problem back a stage: How does the surface provided by A itself have any efficacy vis-à-vis B?  And how does the “point of contact” provided by revelation itself make any contact with human nature?

It is also only fair to point out that some modern Catholic thinkers have taken views which at least flirt with the second tendency I described above -- though in part under the influence of Barth.  Hans Urs von Balthasar sought to meet Barth halfway by rejecting the conception of man’s natural state developed within the Thomistic tradition and central to the Neo-Scholasticism fostered by Leo’s Aeterni Patris (a conception which I described in a recent post on original sin).  On this traditional view, the natural end of human beings is to know God, but only in a limited way.  The intimate, “face to face” knowledge of the divine nature that constitutes the beatific vision is something we are not destined for by nature, but is an entirely supernatural gift made available to us only through Christ.  In place of this doctrine, Balthasar put the teaching of his fellow Nouvelle Théologie proponent Henri de Lubac, who held that this supernatural end is something toward which we are ordered by nature.  Whether it is even coherent to maintain that a supernatural gift can be our natural end, and whether de Lubac’s teaching can ultimately be reconciled with the traditional Catholic doctrine of the “gratuity of the supernatural order” reasserted by Pius XII, have for several decades now been matters of fierce controversy.  But the apparent (even if unintended) implication of the position staked out by de Lubac and Balthasar is that there is no such thing as a human nature intelligible apart from grace and apart from Christian revelation.  And it is in that case hard to see how there could be a natural theology and natural law intelligible to someone not already convinced of the truth of that revelation.

Related to this is Etienne Gilson’s tendency to deemphasize the Aristotelian core of Aquinas’s system and to present it instead as a distinctively “Christian philosophy.”  As Ralph McInerny argued in Praeambula Fidei: Thomism and the God of the Philosophers, Gilson’s position, like de Lubac’s, threatens to undermine the traditional Thomistic view that philosophy must be clearly distinguished from theology and can arrive at knowledge of God apart from revelation.  Such views thereby “unwittingly [erode] the notion of praeambula fidei” and “lead us along paths that end in something akin to fideism” (p. ix).  

McInerny’s book, along with other recent works like Lawrence Feingold’s The Natural Desire to See God according to St. Thomas Aquinas and His Interpreters and Steven A. Long’s Natura Pura, mark a long-overdue recovery within mainstream Catholic thought of an understanding of nature and grace that was once common coin, and apart from which the possibility of natural theology and natural law cannot properly be understood.  Nor, I would say, can other crucial matters properly be understood apart from it (such as original sin, as I argue in the post linked to above).  The blurring of the natural and the supernatural may also lie behind a tendency in some contemporary Catholic writing to overemphasize the distinctively theological aspects of some moral issues.  For example, an exposition of traditional sexual morality that appeals primarily to the Book of Genesis, the analogy of Christ’s love for the Church, or the relationship between the Persons of the Trinity may seem more profound than an appeal to (say) the natural end of our sexual faculties.  But the result of such a lopsided theological emphasis is that to the non-believer, Catholic morality can (again to use Bruce Charlton’s words) falsely “seem to rely on diktat of scripture and the Church” and thus appeal only to the relatively “tiny, shrinking realm” of those willing to accept such diktat.  It will fail adequately to explain to those who do not already accept the biblical presuppositions of Pope John Paul II’s “theology of the body” or of a “covenant theology of human sexuality,” their merits notwithstanding, exactly how Catholic teaching is rationally grounded in human nature rather than in arbitrary divine or ecclesiastical command.  Grace doesn’t replace nature but builds on it; and an account which heavily emphasizes the former over the latter is bound to seem ungrounded.

The late pope himself realized this, whether or not all of his expositors do.  In Memory and Identity he says:

If we wish to speak rationally about good and evil, we have to return to Saint Thomas Aquinas, that is, to the philosophy of being [i.e. traditional metaphysics].  With the phenomenological method, for example, we can study experiences of morality, religion, or simply what it is to be human, and draw from them a significant enrichment of our knowledge.  Yet we must not forget that all these analyses implicitly presuppose the reality of the Absolute Being and also the reality of being human, that is, being a creature.  If we do not set out from such “realist” presuppositions, we end up in a vacuum. (p. 12)

And in Chapter V of Fides et Ratio he warned:

There are also signs [today] of a resurgence of fideism, which fails to recognize the importance of rational knowledge and philosophical discourse for the understanding of faith, indeed for the very possibility of belief in God.  One currently widespread symptom of this fideistic tendency is a “biblicism” which tends to make the reading and exegesis of Sacred Scripture the sole criterion of truth…

Other modes of latent fideism appear in the scant consideration accorded to speculative theology, and in disdain for the classical philosophy from which the terms of both the understanding of faith and the actual formulation of dogma have been drawn.  My revered Predecessor Pope Pius XII warned against such neglect of the philosophical tradition and against abandonment of the traditional terminology.

And the Catechism promulgated by Pope John Paul II, citing Pius XII, affirmed that:

human reason is, strictly speaking, truly capable by its own natural power and light of attaining to a true and certain knowledge of the one personal God, who watches over and controls the world by his providence, and of the natural law written in our hearts by the Creator… (par 37)

There is a reason why the first Vatican Council, while insisting that divine revelation teaches us things that cannot be known by natural reason alone, also taught that:

The same Holy mother Church holds and teaches that God, the source and end of all things, can be known with certainty from the consideration of created things, by the natural power of human reason…

and

Not only can faith and reason never be at odds with one another but they mutually support each other, for on the one hand right reason established the foundations of the faith and, illuminated by its light, develops the science of divine things…

and

If anyone says that the one, true God, our creator and lord, cannot be known with certainty from the things that have been made, by the natural light of human reason: let him be anathema.

and

If anyone says that divine revelation cannot be made credible by external signs, and that therefore men and women ought to be moved to faith only by each one's internal experience or private inspiration: let him be anathema.

and

If anyone says… that miracles can never be known with certainty, nor can the divine origin of the Christian religion be proved from them: let him be anathema.

The point of such anathemas is not to settle by fiat the question of whether God exists or whether miracles have actually occurred; obviously, a skeptic will be moved, if at all, only by being given actual arguments for these claims, not by the mere insistence that there are such arguments.  The anathemas are directed at the fideistic, subjectivist Christian who would dismiss the atheist’s demand that faith be given an objective, rational defense, and who thereby makes of Christianity a laughingstock.  Preaching Christianity to skeptics without first setting out the praeambula fidei, and then complaining when they don’t accept it, is like yelling in English at someone who only speaks Chinese, and then dismissing him as a fool when he doesn’t understand you.  In both cases, while there is certainly a fool in the picture, it isn’t the listener.

Friday, January 13, 2012

I post this article not for its specific topic of an anticipated Anglican migration into the Roman Catholic Church, but for the writer's critique of modernity with respect to that topic, from which he admittedly wanders a little far afield. His critique is essentially my own in "The Church and the Culture of Modernity".

http://www.communio-icr.com/articles/PDF/hanby36-3.pdf

Friday, September 9, 2011

While this blog is mainly interested in the root causes of modernity, this piece about its profound effects was too good not to pass on.

The Rise of Localist Politics 
Until very recently, the centralization of administrative power under expert control — what we might call, for shorthand, rational planning — was considered essential to public policy solutions. In the industrial and post-industrial eras, advances in science and technology seemed to promise a future of unprecedented efficiency. Centralized programs could coordinate masses of people toward desired goals, in areas from government to business to philanthropy to city planning. Modern policy problems were considered to be, fundamentally, systemic issues too complex for local citizens and requiring expert professional attention. Technology and globalization would only increase the value of this approach.
Now, however, trends have begun to shift in a very different direction. Some of the preeminent projects of rational planning are foundering or altogether failing. The entitlement crisis, the housing bubble, and other prominent stories and scandals have made Americans more skeptical of distant experts. Advances in technology and business have created new possibilities for individual and local empowerment. The pressure is on for products, services, and organizational practices that will enable consumers and participants to solve problems themselves.
By contrast, rational planning viewed human beings mainly in the aggregate, essentially as a collection of data points that could be predicted and manipulated based on such categorical differences as race and gender. The messy web of mediating institutions — families, churches, nonprofits — could be sidestepped. Mass programs, which could operate on a scale impossible in the pre-industrial age, would be able to deal directly with the masses, matching problems with solutions and products with demand. Freed from the complex and sometimes onerous network of relationships formerly required for political life, Americans would interact directly with the powerhouses of finance and planning: the government, major corporations, big foundations, and so on.
This model, it was believed, could be applied across the board. Its most obvious value was in the mass production of goods and services. Top-down, command-and-control business models, replicated identically across the world, would bring ruthless efficiency to the private sector. Corporations would get bigger and bigger, driving material prosperity. And these concepts were applied not just to government and commerce but also to aspects of social life, including city design, which became specialized so that people would live in one place, work in another, shop in another, and play in still another (the invention of the suburb took this model to its logical end). Cities and houses, said French architect Le Corbusier, were “machines for living in.”
But while rational planning allowed for success and efficiency on a greater scale than ever before, it also extended failure and inefficiency to the same scale — and nowhere has this been more obvious than in the political and social sphere. The impending fiscal collapse of the major entitlement programs of the twentieth century signals just what an enormous failure rational planning often proved to be. And “big philanthropy” ran into similar problems as “big government.” Large private foundations like those of Rockefeller and Gates dedicated themselves to wiping out social problems with millions of dollars and professional plans. These foundations have pursued technocratic solutions to such problems as school reform and AIDS in Africa — and they are baffled when, as so often happens, their multibillion-dollar efforts fail miserably. What these failures in government and philanthropy have in common is the idea that whole societies are just “machines for living in.” Experts, the rational planners believed, could descend on a big problem, substitute their theoretical (“scientific”) knowledge for the practical knowledge of the locals, and fix it.
Entire generations in the United States have now grown up in the society the rational planners envisioned, complete with established suburbs, schools, big businesses and foundations, and federal entitlement programs. They live in suburban socioeconomic segregation, and rarely participate in local politics (which has largely become professionalized). Some newer cities, like Houston, were designed by their planners around the car and the TV — not the citizen and the self-governing community. A parent today has good reason to take his family to the suburbs for cheaper housing and better schools, a low-income citizen has every incentive to collect a government welfare check, and neither has any clear reason to participate in politics except to lobby the bureaucracy to maintain his status quo. The experts will take care of the rest.
Yet over the course of a century, human experience has not validated the rational planning assumption — and a response is coming, if the rising generation is any indication. The people who grew up under the realized model of the rationally planned society are increasingly inclined to shrug it off. Rational planning seems to have created a demand for precisely the things it required people to give up. People who have grown up this way — particularly young people now in their teens, twenties, and early thirties — feel isolated and long for a sense of place. They want to make a difference, not in mass organizations or abstract causes, but in connections and relationships close to home. Where their parents protested, these young people volunteer. They often find their first taste of community life in college, where they live, work, and play in the same environment, and can participate in the community by choosing from among the hundreds of student groups and activities on offer. A 2010 study at the University of Northern Colorado found that students who were involved in at least one campus organization considered the university to be a community; those who weren’t involved did not. In short, it seems that to feel connected to the big, they need to be active in the small.
Forward-thinking CEOs, looking to hire these young people, are structuring their companies accordingly. The cutting-edge companies of today still use metrics and scientific techniques of the sort that characterized the rational planning era, but they are also seeking to develop a more place-centered, organic approach. The simple reason: command-and-control can solve some problems, but often creates others — chief among them the corporate ignorance fostered by a lack of on-the-ground expertise. The Prelude Corporation, at one time the largest lobster producer in North America, tried rational planning — and discovered (too late to save itself) that lobster fishing relies heavily on local knowledge. GM and Chrysler, bloated beyond the control of their centralized management, needed federal bailouts in 2009.
By contrast, Ford is on the upswing after making aggressive changes to allow its teams the freedom to innovate. In 2008, the management of Starbucks realized it had started to obsess over mass production and growth, and gotten away from what made its company work — small teams dedicated to making good coffee. Rather than the top-down hierarchical strategy of directed control, companies like these are developing organizational cultures manifested through smaller networks in which local knowledge matters; they emphasize getting the best out of a team rather than micromanaging and bossing it around. The organizations that have made these adjustments — or were founded based upon them, such as Apple, Amazon, and Google — are reporting higher job satisfaction, faster innovation, and greater profits than organizations still laboring under the old methods.

Civic Life, Politics, and Place
This “localist” trend is beginning to reshape American politics as well. Among its other flaws, the rational planning model was based on the mistaken notion that science could be substituted for the practical knowledge of ordinary citizens. But the social sciences have simply never come close to approaching the physical sciences in their explanatory or predictive power. They cannot grasp or manage some of the most basic variables in public policy, including the human need for ownership over our stake in society — that is, the needs for belonging and participation. As a 2009 report for the James Irvine Foundation puts it, people “want the opportunity to be more than passive audience members whose social activism is limited to writing a check.” And as Robert Putnam, author of Bowling Alone (2000), has documented, communities whose citizens feel a sense of local empowerment report (among other things) better local government, less crime, and faster economic growth. Many citizens are more inclined to participate even in the most basic act of civic life — voting — when a particular issue seems to directly affect them, and they are convinced they can affect it back.
American cities are catching on to the change. Whether in city design or problem-solving, more and more municipalities are trying solutions that involve multifaceted participation, as documented in a 2009 report for Philanthropy for Active Civic Engagement. Some, like Rochester, New York, have sought to improve their local governance by instituting neighborhood councils; this allows people to relate to the city from the vantage point of a smaller political unit they can see and understand firsthand. Boston, with its “Complete Streets” project, is experimenting with “new urbanism,” a mixed-use method of city design that makes neighborhoods more self-sufficient and friendly to social interaction. And cities from Colorado Springs to St. Petersburg, Florida are making headway against social problems through public-private partnerships and a thriving nonprofit sector. For public policy at the national or state level to succeed, it increasingly appears that it must find ways to empower, rather than hinder, local self-government — and in doing so, it has to resist the temptation to micromanage from afar.
The move toward localism is driven by expediency more than ideology. Cities, businesses, and other organizations are instituting place-centered practices not because of identification with a movement or theory, but because they are finding that a more organic approach just plain works better. Doing things the “messy” way often proves more effective in the long run.
This shift makes electoral politics trickier, too: muddling through and finding messy local solutions is harder to sell to the public than a grand, oversimplified vision. Tougher still is encouraging localism while refraining from excessive intervention. But it is possible that the leader or party who embraces the localist approach, who articulates the ideas underlying it, who treats communities of engaged citizens as if they matter, may actually have the opportunity to sell it as a grand vision — to make it a movement.

Localist Politics, Left and Right
Much of the recent rise of localism has come from the left, from foodie and environmental efforts on the cultural side, to the extensive use of social media to mobilize community activists on the political side. Actually, localist rhetoric has existed on the “New Left” since the 1960s, when radicals like Saul Alinsky argued that rational planning left out the importance of community organization and local leadership. Fundamentally, however, even the New Left did not abandon the left’s longstanding preference for rational planning with its emphasis on people in the aggregate — that is, in masses. “People are the stuff that makes up the dream of democracy,” argued Alinsky in Reveille for Radicals (1946). He shared the old left’s view of people as masses in categories and wanted to mobilize the groups for larger political goals. The modern new left has not departed from that mindset; only its preferred method is different.
The standard-bearer for the modern left is, of course, Alinsky’s intellectual descendant Barack Obama, the community organizer whose 2008 presidential campaign is a useful case study in the irony of localism on the political left. His rallying cry was an appeal to ordinary citizens to get involved, to serve in our communities, to be the change that we’d been waiting for. This rhetoric may have been vague, but it was certainly not a call for big, centralized government. But localism and community service are not what the Obama administration has focused on or will likely be most remembered for.
It is hard, first of all, to find examples of Obama administration initiatives for community service that are significantly different from those of the Bush administration. More notably, President Obama has overseen an explosion in the size of the federal government, even beyond the controversial bailouts that at least had the (arguable) justification of averting a depression. Most significantly, his administration has delivered on another of his central campaign promises: the passage of his 2,700-page health care bill, which, along with Franklin D. Roosevelt’s New Deal and Lyndon B. Johnson’s Great Society, completes the great trifecta of the twentieth-century liberal vision of central administration. Despite the different packaging, the left and its leaders are still champions of rational planning, and they refuse to see the problems that arise from it as anything but evidence of need for further expert tweaking. It would be too difficult otherwise to admit the irredeemable failures that have arisen from a century of their governing philosophy.
But where do conservatives stand on questions of localism in American political life? While the rhetoric of the right has vehemently opposed the progressive faith in rational planning, the right has actually implicitly joined the left in its acceptance of the old paradigm: political life is characterized by individuals in the aggregate. The difference is that whereas the left emphasized the aggregate, the right has emphasized the individual. There were sound historical reasons for this orientation: conservative heroes such as Goldwater, Reagan, and their intellectual successors were fighting a battle against collectivism at home and abroad, protecting the individual against the heavy press of the group. But the resulting strong libertarian streak has led much of the right to blindly disdain all government, including the crucial institutions of local self-government. Likewise, the backlash against the Obama administration’s big-government efforts — evident especially in the Tea Party movement — has come in the form of appeals to individual liberty rather than calls for local self-government, stronger communities, and responsible citizenship.
A nation’s ability to have a “small” national government depends on its ability to foster strong civic life on the local level. For example, the federal government’s direct role in fighting homelessness has noticeably decreased over the past decade, as public-private partnerships between city governments and local nonprofits have proven more effective than the federal failures of the Great Society. The city of Denver, under a Democratic mayor (now governor of Colorado), reduced its chronically homeless population by over 60 percent in four years on the strength of strategic partnerships with faith-based nonprofits. Although these efforts sound like they would appeal to conservatives, few have been promulgated or picked up by the GOP. While Republicans may be willing to challenge rational planning from time to time, they have largely been unable to recognize that they are arguing on the old paradigm’s playing field — they are defending their own end zone, but not suggesting a different sport, responding to centralization and isolation by maintaining the individual’s right to be isolated.
Meanwhile, a small “new right” has begun to emerge: more localist than nationalist, more Burke than Hayek, and fairly amicable with the New Left (many of its members are not Republicans). In Britain, a similar coterie has gained significant political influence with the ascendancy of Prime Minister David Cameron and his “Big Society,” encouraging people to get involved in their communities instead of relying on the government for services. In the United States, it is mainly composed of offbeat academics, has few formal organizations, and has the Internet for its main intellectual outlet, on sites such as FrontPorchRepublic.com — an opinion source for so-called “crunchy cons,” as Rod Dreher called them in his book of the same name. “There are hopeful signs that people are beginning to think seriously about the importance of localism, human scale, limits, and stewardship, the very things woefully lacking in the current spending orgy,” writes Mark Mitchell, a professor at Patrick Henry College and a regular contributor to FrontPorchRepublic.com. “While a return to these ideals is still only in its infancy, change is afoot. This represents a glimmer of sanity in a world succumbing to the apparent security promised by centralization.”
But overall, the new right is still at a theoretical stage: its adherents rarely offer specific policy proposals, and too frequently, its ideas are unspecific or unrealistic. This new right has little political influence and no organized strategy. But, like the right in general, it has devoted a great deal of thought to foundational ideas from which specific policies could be developed.
While localism has so far been a movement mostly on the left, it seems ripe for the right to take it up as its own. Indeed, it remains difficult to fully reconcile localism with the left’s remaining adherence to centralized government and rational planning. Localism is philosophically more at home on the right: at the heart of conservatism is a belief in the value of relationships, self-government, and local institutions. It is high time for the right to put the policy together with the principles. We are, after all, the change that we’ve been waiting for.


Brian Brown is the director of the Forum for Civic Advancement at El Pomar Foundation in Colorado. This essay represents the views of the author and not necessarily those of the foundation.

Tuesday, August 23, 2011

This is a piece by Thaddeus Kozinski on religious pluralism and the confessional state. It serves as a nice précis of his excellent book, The Political Problem of Religious Pluralism.

The Good, the Right, and Theology

Thaddeus J. Kozinski


Theology has been a curious non-interlocutor in most public debates among conservative theists regarding how best to defend the objectivity, intelligibility, and communicability of moral truths and their application to contemporary legal issues, such as racial discrimination, human rights, and abortion. One such debate occurred recently on the pages of On the Square and Public Interest. (For a summary of and commentary on the debate, see Micah Watson’s “A Tale of Two Philosophers.”) The main issue of the debate was not the content of basic moral principles, but their epistemological, ontological, and rhetorical aspects: the fundamental structure of moral thinking and judgment, its relation to what precisely is being thought about and judged, and the most reasonable and effective mode of public ethical and legal discourse. The two interlocutors agreed “that the source of morality is human nature, that human nature is essentially a rational nature, and that moral truths are discoverable through reason apart from revelation,” and they both condemned the moral evil of racial discrimination. What they were at odds about is exactly why this or any evil act is evil, and what makes an act good and a moral principle true. The question comes down to the precise ontic and epistemic character of “ought.”
For Arkes, racial discrimination is a big “ought not” because—and only because—it is unreasonable; it is an act that violates a knowable and known principle of reason, that humans have moral status and dignity by virtue of what makes them human, namely rationality and freedom. Thus, treating a human being as less than human based upon what does not define them essentially as human, such as skin color, is unreasonable and therefore wrong. And since reason is ultimately anchored in the law of non-contradiction, racial discrimination is evil because it violates this most fundamental and self-evident law of human reason. If a human being is properly defined as possessing an essential equality with all other humans, then it is contradictory to commit racial discrimination, since it entails one human being with this essential equality treating another human being as not having this equality. Thus, legal proscriptions against this evil practice, for Arkes, should be explicitly grounded in and justified by just this sort of explanation. O’Brien, on the other hand, identifies Arkes’ characterization of the location, derivation, and justification of moral knowledge as essentially Kantian and therefore problematic: “To have substance, morality needs to go beyond mere rational consistency and find its grounds in the form of ‘rational animality,’ as Aristotle and Aquinas saw, but which Kant mistakenly rejected as ‘heteronomous.’” For O’Brien, moral evil is not evil primarily because it is and is seen to be self-evidently unreasonable in light of some sort of a priori, abstract conception of the rational being as such, but because it is and is seen to be vicious in light of concrete, personal, historical, tradition-constituted, community-informed experience, in terms of a conception of human flourishing and happiness that answers not so much the question why one ought to do this or that, but what we, qua members of this community and tradition, need in order to live and live well.
As it seems to me, this debate is a scuffle in an ongoing human feud, begun back in the wranglings between the ancient Stoics and Epicureans. It is a war between “two rival versions of moral enquiry,” to use MacIntyre’s expression, eudaimonism and deontologism: an ethics of happiness, flourishing, virtues, eros, and the good, versus an ethics of self-sacrifice, duty, law, agape, and the right. This feud is not going to end any time soon, at least not without some mediation, by a third, peace-making interlocutor.
As I said at the outset, theology, unlike in the ancient debates, has not been an interlocutor in this and virtually all other academic and public discussions of ethics and politics. Sure, the theologian is allowed to have his say, but he is barred from ever having an authoritative say, from being one of those insiders whose deliberations and speculations are to become an integral part of “public reason.” The theologians have a quite compelling story, the philosophers and public policy folks admit, but we need a story more appropriate, more “true,” for our pluralistic, secular, political culture. However, when dealing with the foundations of ethics, the Christian theologian’s story is not just one story among others—it is one that must be read by everyone, for it is meant for everyone. It is ultimately everyone’s story. Moreover, as Radical Orthodoxy has shown, the ostensibly a-theological, secular stories that automatically pass the muster of public reason are nothing if not theologically implicated, even if only implicitly. Now, although the Christian story is everyone’s story, only a very select audience has heard it in its entirety, believed it fully, and made it a model for their own life-stories. Yet, even for the unbeliever, the theologian’s story has clear and arguable logical, ethical, philosophical, legal and political ramifications and components, just as the “non-theological” stories have implicit yet robust theological moorings. Let those who have ears—that is, those who have taken out their old and decrepit, modernist, Enlightenment earplugs—hear: “We are all theologians now.”
The inseparability of faith and reason, in both theory and practice, is one of the main points of Benedict XVI’s encyclical teachings. We can debate the political and philosophical ramifications of the affirmation that we are made in the image of God, that God loves us, and that He commands us to “be perfect as His father in heaven is perfect”; however, in the end, we either affirm these truths or we do not, based upon whether we have or have not encountered the living Christ, caritas in veritate, or perhaps just encountered those Christians who have. So, if human acts are a matter of experience, choice, and grace—not just logic, evidence, and demonstration, whether Aristotelian-eudaimonistic or Kantian-deontological in mode—then any debate about the metaphysical, epistemic, and rhetorical aspects of ethics must invite theology as an interlocutor. And this neglect of theology is the reason that the debate between Arkes and O’Brien is, as it stands, irresolvable.
The problem is that they are both right. O’Brien is correct that arguments about and declarations of principled moral prescriptions and proscriptions, even rigorous and true ones, cannot ensure a public commitment to and embodiment of Christian or even humanistic values in our post-Enlightenment, neo-pagan, pluralistic political culture. Moral principles are experiential, cultural, and historical in their genealogy and in the subjective apparatus of human recognition. But Arkes is right that we can and must transcend these contingencies to see and act on principles in an absolute, universal, and eternal way. In other words, although reason is tradition-dependent (pace Kant), it is also tradition-transcendent (cum Kant). Somehow we must hold these together, and I don’t think we can outside of a theological narrative and discourse.
And the problem is that they are both wrong. Western nation-states lack a shared intellectual tradition to provide grounding for the abstract meaning of universal, human rights and moral values. They also lack a communally shared ethos, which is required for the effective, authentic, and integral political and legal embodiment of rights and values. As O’Brien’s argument suggests, the discourse-of-moral-principle-alone, in prescinding from experiential genealogy and a moderate historicist sensibility, is ultimately sterile. Public reason in today’s secular culture mistakenly eschews any theological dogma that might shed authoritative light on the ultimate meaning, derivation, and fulfillment of human life and experience. On the other hand, as Arkes maintains, a discourse-of-moral-experience-alone absent the universal, history-and-experience transcending logos is ultimately indeterminate, for it is sub-rational. The right and the good must live together or die alone.
Here MacIntyre sums up what he considers the essential problem with a natural-law morality and argumentation that tries to transcend contingency and experience. MacIntyre is critiquing Maritain’s “democratic charter,” where natural-law norms, not religious or philosophical particularity, are the bases for political consensus:
What Maritain wished to affirm was a modern version of Aquinas’ thesis that every human being has within him or herself a natural knowledge of divine law and hence of what every human being owes to every other human being. The plain pre-philosophical person is always a person of sufficient moral capacities. But what Maritain failed to reckon with adequately was the fact that in many cultures and notably in that of modernity plain persons are misled into giving moral expression to those capacities through assent to false philosophical theories. So it has been since the eighteenth century with assent to a conception of rights alien to and absent from Aquinas’ thought.1
According to this view, Arkes’s model would be analogous to Maritain’s and so not sufficiently aware of the fact that—while men may argue and think about moral truth, and value and pursue moral goods without conscious deference to a particular philosophical theory or religious belief—they nevertheless possess implicit and unconscious philosophical commitments that influence and condition the character and interpretation of that evaluation and pursuit. These commitments determine to some extent the character of behavior that is the conclusion of the practical reasoning that begins with the evaluation and pursuit of a particular good. Since rationality itself is a practice, it inevitably takes the shape of the particular lived tradition of which it is a part. In practice, then, there is no rationality as such, but only particular rationalities informed by particular religious, philosophical, anthropological, and epistemological commitments that condition the manner in which that rationality is applied to practical questions. Therefore, with citizens divided in traditional allegiance, one should not expect rational agreement on practical matters of a moral nature, especially not on the foundational moral values of the political order. As MacIntyre argues in Whose Justice? Which Rationality?: “There is no way to engage with or to evaluate rationally the theses advanced in contemporary form by some particular tradition except in terms which are framed with an eye to the specific character and history of that tradition on the one hand and the specific character and history of the particular individual or individuals on the other.”2
For MacIntyre, a strictly principled, obligation-laden, logic-derived articulation of moral goods and rights cannot serve as the political foundation of a tradition-pluralistic regime. For we are “tradition-constituted, culturally dependent rational animals” that cannot effectively separate our beliefs from our values and the actions derived from them. Though the citizens in a pluralistic polity may share a common lexicon of “human rights” and “democratic values,” in reality, it is a house built on sand with a sinking foundation of entirely disparate understandings of that lexicon and radically disparate traditions of practical rationality: Thomist, Humean, Kantian, Rousseauian, Nietzschean, Deweyean, et al. For MacIntyre, shared moral evaluation and understanding are extremely limited, if not impossible altogether, in the absence of a shared tradition of practical rationality, including a common reservoir of theological, philosophical, ethical, and anthropological concepts, and common virtues and goods attained in and through the various practices—especially the architectonic practice of politics—that constitute a shared tradition. This is why we have so much moral disagreement in our public discourse. Tracey Rowland describes MacIntyre’s position: “MacIntyre’s analysis raises the question of whether there can be any such things as ‘universal values,’ understood not in a natural law sense, but rather…the idea that there is a set of values which are of general appeal across a range of traditions, including the Nietzschean, Thomist, and Liberal traditions.” MacIntyre again:
Abstract from the particular theses to be debated and evaluated from their contexts within traditions of enquiry and then attempt to debate and evaluate them in terms of their rational justifiability to any rational person, to individuals conceived as abstracted from their particularities of character, history, and circumstance, and you will thereby make the kind of rational dialogue which could move through argumentative evaluation to the rational acceptance or rejection of a tradition of enquiry effectively impossible. Yet it is just such abstraction in respect of both of the theses to be debated and the persons to be engaged in the debate which is enforced in the public forms of enquiry and debate in modern liberal culture, thus for the most part effectively precluding the voices of tradition outside liberalism from being heard.3
But let us suppose it is true that citizens belonging to the same narrative tradition would form a more unified, robust, stable and strong political order, so that exceptionless and self-evident rights and laws deriving ultimately from the law of non-contradiction and man’s obvious end-in-himself dignity, would serve as the most effective public discourse. Unfortunately, the demographic and sociological exigencies of the modern, pluralistic nation state preclude such narrative unity. We cannot have forced conversions to our narrative of choice, and so we must accept the limitations of our “concrete historical ideal,” as Maritain would say: the fact of religious pluralism requires us to attempt, even if it seems impossible, the separation of the public, legal, political sphere from the particularity of our traditions. But can such be done? Is this kind of acquired schizophrenia necessary to be a good pluralist citizen?
Conservative theists endorse wholeheartedly the infusion of integrally religious practices and discourse into the naked public square; yet they also tend to limit the participation in and scope of these practices and discourses to the in-house crowd, as it were. For those outside their tradition, and for the secular public sphere in general, they urge a program of translation—a translation of dogma, ritual, charitable acts, and especially the natural law—into the language of principled, universal “public reason.” Speaking only this language to strangers secularizes, moralizes, and politicizes what is distinctly theological and spiritual in our tradition, both in doctrine and in practice, in order to render it intelligible to non-theists and practically effective for secular society.
However, this strategy presupposes two fundamental ideas that need to be reexamined. The first is that there is such a thing as the “secular,” that is, an ideologically neutral, universal, public world accessible to and based upon a universal public reason, abstracted from the practical and speculative particularities of tradition. However, if not, if there is no objective, public reason, then it would seem that all we are left with are the postmodernist hermeneutics of suspicion, or the will to power, where any affirmation of true or good is unmasked as either mere idiosyncrasy or the will to dominate. The second idea that must be reconsidered is the easy separability of theoria and praxis, the confidence that one can effectively strain out from the concrete practices and particularist discourse of one’s tradition a secular, universally accessible remainder that is intelligible to all regardless of traditional allegiance.
Regarding the existence of a secular reason or public space neutral to any particular tradition, MacIntyre writes:
Either reason is thus impersonal, universal, and disinterested or it is the unwitting representative of particular interests, masking their drive to power by its false pretensions to neutrality and disinterestedness. What this alternative conceals from view is a third possibility, the possibility that reason can only move towards being genuinely universal and impersonal insofar as it is neither neutral nor disinterested, that membership in a particular type of moral community, one from which fundamental dissent has to be excluded, is a condition for genuinely rational enquiry and more especially for moral and theological enquiry.4
For MacIntyre, as well as, I think, for O’Brien, it is only through active participation in particular authentic traditions that men are rendered capable of discovering and achieving their ultimate good. For it is always through a particular tradition that we ascend to universal truth. Indeed, without tradition we are unable to make any sense of reality at all, because our bodies, minds, and souls are, largely, products of tradition themselves. As body and soul composites, our encounters with reality are mediated by bodies, which are themselves mediated by history and culture. Even the words and concepts we use to interpret and make sense of the brute facts of reality originate and develop in what MacIntyre calls “traditions of rationality.” All men are necessarily habituated into a particular tradition, even if it is an incoherent and considerably defective one like the tradition of liberalism. Outside of tradition, coherent knowledge and discovery of the good is practically impossible. We are, in MacIntyre’s improvement on Aristotle’s classic definition, “tradition-dependent rational animals.” As Paul Griffiths puts it: “To be confessional is simply to be open about one’s historical and religious locatedness, one’s specificity, an openness that is essential for serious theological work and indeed for any serious intellectual work that is not in thrall to the myth of the disembodied and unlocated scholarly intellect.”5
Regarding the capacity to translate particular religious truth into non-religious public reason, MacIntyre articulates what can be called the traditionalist dilemma:
The theologian begins from orthodoxy, but the orthodoxy which has been learnt from Kierkegaard and Barth becomes too easily a closed circle, in which believer speaks only to believer, in which all human content is concealed. Turning aside from this arid in-group theology, the most perceptive theologians wish to translate what they have to say to an atheistic world. But they are doomed to one of two failures. Either [a] they succeed in their translation: in which case what they find themselves saying has been turned into the atheism of their hearers. Or [b] they fail in their translation: in which case no one hears what they have to say but themselves.6
Is there a solution to this dilemma? Is there a resolution between Arkes and O’Brien, between eudaimonism and deontologism? If there is, the indispensable condition for its realization, I think, is the recognition of the illusory nature of secularist liberal pluralism. Indeed, there is really no such thing as “liberalism,” if this means a sphere of reason or action that escapes the particularism and exclusivity of tradition. And there is also no such thing as “the secular” since traditions of rationality are distinguished by the particular way they grapple with matters of ultimate concern—all traditions are ultimately religious. This has great political implications. David Schindler writes: “A nonconfessional state is not logically possible, in the one real order of history. The state cannot finally avoid affirming, in the matter of religion, a priority of either ‘freedom from’ or ‘freedom for’—both of these priorities implying a theology.”7
If believing theists of diverse traditions do not think, speak, and act distinctively as Catholics, Protestants, Jews, and Muslims—bringing their intellectual, moral, and liturgical traditions wherever they go in imitation of Socrates, whom Catherine Pickstock calls a “walking liturgy”—then our “ecumenical jihad” stands no chance of converting the “liberal traditionalists” of the culture of death, who have no qualms about communicating to themselves and others exclusively in their religious parlance of tolerance and diversity, and inviting all into their liturgical practices of abortion, same-sex marriage, and euthanasia. Indeed, they see themselves as the “true believers,” the only ones truly defending “life,” with us as the heretics, obsessed only with death and control.
How can these deluded devotees have any hope of ever renouncing their enslaving tradition unless they are made aware of its enslaving character? And how can they become aware unless they have some palpable experience of an alternative? The tradition they inhabit deprives them of the existential conditions required to see moral truths, let alone religious ones, as Tristram Engelhardt has pointed out: “In the grip of Enlightenment dispositions regarding religion, few are inclined to recognize that the moral life once disengaged from a culture of worship loses its grasp on the moral premises that rightly direct our lives and foreclose the culture of death.”8 D. Stephen Long puts the whole point powerfully:
Beginning with the flesh of Jesus and its presence in the church, theology alone can give due order to other social formations—family, market, and state. The goodness of God is discovered not in abstract speculation, but in a life oriented toward God that creates particular practices that require the privileging of certain social institutions above others. The goodness of God can be discovered only when the church is the social institution rendering intelligible our lives. . . . For a Christian account of this good, the church is the social formation that orders all others. If the church is not the church, the state, the family, and the market will not know their own true nature.9
Moral judgments are certainly principled judgments, and we should search for and declare these principles, even enforce them in law. Yet all principles of reason, whether moral or logical, are first and foremost expressions of the divine logos, who can be encountered in and through his manifold, principled, universal expressions. But absent a personal, experiential encounter with Him through Faith, in the very particular place and time where His Flesh becomes available to touch and experience, principles are just principles—fleshless, bloodless, and dead.

Thaddeus J. Kozinski is Assistant Professor of Humanities and Philosophy at Wyoming Catholic College.

1. Alasdair MacIntyre, Three Rival Versions of Moral Enquiry: Encyclopaedia, Genealogy, and Tradition (Notre Dame: University of Notre Dame Press, 1990), 76. [back]
2. Alasdair MacIntyre, Whose Justice? Which Rationality? (Notre Dame: University of Notre Dame Press, 1988), 398. [back]
3. MacIntyre, Whose Justice? Which Rationality?, 399. [back]
4. MacIntyre, Three Rival Versions of Moral Enquiry: Encyclopaedia, Genealogy, and Tradition, 59. [back]
5. Paul J. Griffiths, “The Uniqueness of Christian Doctrine Defended” in Christian Uniqueness Reconsidered: The Myth of a Pluralistic Theology of Religions, ed. Gavin D’Costa (Maryknoll, NY: Orbis Books, 1996), 169. [back]
6. Alasdair MacIntyre, Against the Self-Images of the Age (New York: Schocken Books, 1971), 19-20. [back]
7. David Schindler, Heart of the World, Center of the Church (Grand Rapids, MI: William B. Eerdmans and T&T Clark, 1996), 83. [back]
8. H. Tristram Engelhardt, Jr., “Life & Death after Christendom: The Moralization of Religion & the Culture of Death,” Touchstone (June, 2001), accessed on June 21, 2007; available from http://www.touchstonemag.com/archives/article.php?id=14-05-018-f. [back]
9. D. Stephen Long, The Goodness of God: Theology the Church and Social Order (Grand Rapids: Brazos Press, 2001), 26, 28. [back]


Wednesday, August 17, 2011

I think you'll like this piece by Joseph Baldacchino that I took from the Front Porch Republic, one of my favorite sites. Baldacchino does a good job of explaining why the problem with conservatism is not electoral, or political in any other sense, but moral and philosophical.

In the wake of the 2008 elections the Republican Party looked to be on its last legs. Not only had Barack Obama triumphed in the presidential race, picking up the electoral votes of such previously “red” states as Virginia, North Carolina, and Florida, but the Democrats had widened the majorities they had gained while taking over both houses of Congress two years earlier. Flush with victory, the Democrats, perhaps understandably, interpreted the 2008 election returns as a mandate for their “progressive” policy agenda, which they proceeded to enact into law with gusto, helping in the process to increase the total public debt outstanding from $10.6 trillion on Inauguration Day 2009 to $13.6 trillion a scant 22 months later.
Then came the mid-term elections of 2010, and the liberal ideological consensus that had seemed so palpable turned out to have been a mirage. Not only did the GOP garner the biggest mid-term gain in House seats achieved by either party since 1938, winning 56 percent of the 435 seats in contention, but the GOP also won an even larger 65 percent of this year’s thirty-seven Senate races. Perhaps even more impressive were Republican gains in the state houses, where they are poised to dominate the congressional redistricting process for the coming decade by controlling 29 of the 50 state governorships and at least 57 of the 99 state legislative chambers.
Will the apparent mandate for a pronounced rightward turn in matters of public policy prove any more lasting or substantial than the one in favor of progressivism that went a-glimmering in the 2010 election? If recent American history is any guide, the answer to this question is: Not very likely. Consider the elections of the past 30 years.
Certainly, 1980 seemed at the time to signal a sea-change in the nation’s ideological allegiances. Not only did Ronald Reagan, the undisputed leader of the conservative movement, sweep to victory over the liberal Democratic White House incumbent, Jimmy Carter, but he also brought in on his coattails Republican control of the Senate, marking the first time the GOP had won a majority of either congressional chamber since 1952. The Democrats, who had controlled the House consistently since 1954, resumed control of the Senate in 1986.
The next significant change occurred in 1992 when the Democrats, led by Arkansas Gov. Bill Clinton, regained the White House after a twelve-year absence. A seemingly more seismic shift in the opposite direction came just two years later when Republicans, spearheaded by Rep. Newt Gingrich (Ga.), gained simultaneous control of both the House and Senate for the first time since the election of 1952.
Though Clinton was reelected in 1996, the Republican congressional ascendancy that began in 1994 continued with only a minor interruption until the 2006 off-year election. In that year, as mentioned, the Democrats regained control of the House: a victory that presaged the Democrats’ sweep of the White House and both houses of Congress in 2008.
Based on the foregoing thumbnail history, the political contests that were most worthy of the label “redefining” or “wave” elections during the past three decades occurred, except for that of 2010, at fourteen-year intervals in 1980, 1994, and 2008. It should be noted that in each of these contests the party that triumphed was the beneficiary of disgust in the electorate with the record of the party in power. Reagan’s 1980 election was in large part a reaction to the economic and foreign policy failures of Jimmy Carter, most notably inflation and interest rates in double digits and the Iranian hostage crisis.
In 1994 the Republicans benefited from the Clintons’ overreaching on national health care and from years of entrenched corruption in the Democrat-controlled Congress, exemplified by scandals involving House Speaker Jim Wright (Tex.), who resigned in 1989, and House Ways and Means Committee Chairman Dan Rostenkowski (Ill.), who was forced to relinquish all leadership posts in 1994 before going down to electoral defeat in that same year. By 2008, amidst the worst financial crisis since the Great Depression, even many Republicans were worn down by the George W. Bush Administration’s many domestic and foreign policy lapses, which provided a ready audience for Obama and the Democrats’ siren song of “change.”
On this evidence, neither major party can lay claim to the support of a stable majority either for its espoused policy prescriptions or for demonstrated political competence. Rather, the nation has become polarized between ardent devotees of Fox News on the right and MSNBC on the left. Elections are determined by a group in the middle that oscillates between the two sides to register dissatisfaction whenever the status quo becomes sufficiently difficult to tolerate. If the most recent “wave” election suggests anything new at all, it may be that the oscillations are becoming more frequent and more pronounced.
Yet Republican leaders in Washington, D.C., have assured us in the wake of their 2010 congressional gains that their victory will not lull them into a false sense of security. The GOP, they insist, recognizes that it is on probation. The Democrats won in 2008 because the Bush Administration failed to live up to conservative principles, and the public will turn against the Republicans again if they don’t mend their ways. But this time will be different, they assure us, because Republicans have understood the public’s message, and this time, under the watchful eye of “Tea Party” activists, Republicans will do the public’s bidding.
“Across the country right now,” explained incoming Speaker John Boehner on election night, “we are witnessing a repudiation of Washington, a repudiation of big government, and a repudiation of politicians who refuse to listen to the people, because, for far too long, Washington’s been doing what’s best for Washington, not what’s best for the American people. Tonight, that begins to change.”
How credible is such rhetoric? At first blush it may seem marginally more plausible than the Democrats’ explanation that the voters would have approved their programs if only they had understood them. But, in fact, not only American government but American society in general has grown increasingly dysfunctional over the past half century. Deep down, many serious observers know this, but few, regardless of political persuasion or walk of life, want to face the depressing reality. To do so would require difficult changes in the way we live. Instead of accepting the necessary pain, we are tempted to look away from the actual situation. We create imaginative visions that paint our dominant desires and inclinations in the best light and excuse us from mending our self-indulgent ways.
Barring difficult efforts of will, the human tendency is to pick and choose parts of reality that would justify sticking to our favored mode of existence. We come up with ideas and slogans—even entire ideologies—that present as actual historical reality not the world as it is but the world as we would like it to be, this in order for us to be able to live as we please. So, when politicians wax eloquent about “conservative principles” no less than when they speak glowingly of “progressive ideals,” the question must be asked: Are they addressing the real world in all its complexity or are they presenting an imaginative dream that advances hidden motives?
All humans are more or less prone to hiding inconvenient truths—from others, certainly, but perhaps most significantly from themselves. The reason is ultimately moral laziness. We know only too well our own weaknesses, but we shrink from the hard inner work that morality and happiness require. As Irving Babbitt observed, all humans want to attain happiness on the cheap—to reap the fruits of the spirit without exerting spiritual effort. This tendency toward escapism has become increasingly common in modern Western society. The pre-modern West—heavily influenced by classical and especially Christian culture—taught that man is born with obligations not only to self but to his fellow members of society: in Jesus’ words, to “love thy neighbor as thyself.”
For Aristotle, as for Thomas Aquinas, the purpose of politics and law was to further the common good of society, which was shared by all in the sense that it was good for its own sake. Differently put, there is a self in man that is more than individual and higher than mere enlightened self-interest, whose nature is to foster genuine community among people. But in the sixteenth century a philosophical and moral revolution began. Encouraged by thinkers such as Bacon, Hobbes, Locke, and Descartes, promotion of the common good was displaced as society’s ultimate purpose by the lesser goal of trying to maximize the satisfaction of conflicting individual and group interests.
Are the Republicans right? Will adhering to “conservative principles” begin to correct the serious problems now besetting American society and thereby provide what is “best for the American people”? Clearly, that depends on what is meant by “conservative principles.” The think tank intellectuals and hired guns are ready with glib answers. Conservatism means “liberty” or “freedom.” It means “limited government.” It means “constitutionalism,” “free markets,” “private property.” But these are general terms, which can each have very different—even opposite—meanings. Whether the mentioned ideas are good or bad depends upon what is meant and the purposes served in each instance.
Traditional conservatives—from Edmund Burke and John Adams in the eighteenth century to Irving Babbitt and Russell Kirk in the twentieth—supported liberty, property, and restraints on government but not as ultimate ends in themselves. They saw them as conducive to efficient production and other commodious arrangements, but most importantly as means to the higher ends of society, which can be summarized in the term “community.”
Contrary to much influential modern thought—Jean-Jacques Rousseau being the most conspicuous example—goodness does not flow spontaneously from human impulses but requires sustained moral effort and supporting cultural and political institutions. Burke recognized the extent to which in England and Europe the latter had been painstakingly developed over centuries. Government, together with other social structures, is necessary to put restraints on actions and desires inimical to man’s higher potential. How much government is needed and what kind cannot be determined in the abstract, but depends on the character of the people of a specific time and place.
For Burke and other traditional conservatives, liberty understood as equally appropriate to all conceivable circumstances is not only irrational but dangerous. Concerning the abstract liberty promoted by the French Jacobins and their supporters, Burke wrote: “I flatter myself that I love a manly, moral, regulated liberty as well as any gentleman . . . . But I cannot . . . give praise or blame to anything which relates to human actions . . . on a simple view of the object, as it stands stripped of every relation, in . . . metaphysical abstraction. . . . Is it because liberty in the abstract may be classed amongst the blessings of mankind, that I am seriously to felicitate a madman, who has escaped from the protecting restraint and wholesome darkness of his cell, on his restoration to the enjoyment of light and liberty? . . .
“I should, therefore,” Burke continued, “suspend my congratulations on the new liberty of France until I was informed how it had been combined with government, with public force, with the discipline and obedience of armies, with the collection of an effective and well-distributed revenue, with morality and religion, with the solidity of property, with peace and order, with civil and social manners. All these (in their way) are good things, too, and without them liberty is not a benefit whilst it lasts, and is not likely to continue long.”
Similarly, John Adams, in an October 18, 1790, letter to his cousin Samuel Adams, wrote: “‘The love of liberty,’ you say, ‘is interwoven in the soul of man.’ So it is, according to La Fontaine, in that of a wolf; and I doubt whether it be much more rational, generous, or social, in one than in the other, until in man it is enlightened by experience, reflection, education, and civil and political institutions.”
In other words, when it becomes common for economic actors, be they janitors or heads of hedge funds, to set aside normal moral and cultural restraints when at work, it will not only undermine the quality of their everyday existence but also damage the honesty and integrity on which a well-functioning market and indeed all civilized life depend. It needs to be understood that in a time of precipitous moral decline freedom may actually become positively destructive of the higher purposes of society. Imagine historical circumstances in which captains of finance have, because of a general moral decline, become unscrupulous, caring little about the welfare of their customers, employees, or society at large. In such a situation, a mentality of unmitigated greed might become pervasive. On the other hand, freedom may become something altogether different where economic and cultural elites embody and expect high standards.
Yet, when the conservative movement so powerful in American politics over the past half century was getting its intellectual start in the 1950s, it became apparent very soon that its participants were profoundly at odds concerning the meaning of freedom, which hinges on the fundamental nature of man and society. Along with Burke and most framers of the American constitution—and in keeping with the pre-modern classical and Christian heritage—conservative academics such as Russell Kirk, Robert Nisbet, and the economist Wilhelm Röpke denounced as reductionism the notion that human beings, who are almost wholly dependent on society for the very attributes that make them human, are ultimately obligated to nothing beyond individual self-interest.
They agreed with Babbitt that freedom, property, constitutional government, and similar rights derive their immense value not primarily from their usefulness to the self-indulgent selves that divide men and women one from another but from their usefulness to the higher or universal self that wills what is good for its own sake and is the basis of community. Indeed, Babbitt held that American liberties owed their very existence to the classical and Christian moral and religious heritage.
But other influential movement founders held the opposite view. Taking sharp issue with the “New Conservatism” of Kirk, Nisbet, Peter Viereck, and others, Frank S. Meyer, who would become a prime architect of the movement, declared sweepingly in a 1955 article that “all value resides in the individual; all social institutions derive their value and, in fact, their very being from individuals and are justified only to the extent that they serve the needs of individuals.” Meyer’s radical individualism, which he attributed in large part to John Stuart Mill, was shared to various degrees by numerous others whose ideas helped shape the early conservative movement, including the economists Ludwig von Mises, Friedrich Hayek, and Milton Friedman.
Movement conservatism was thus divided from its beginning on the central issue of man’s moral nature and its relation to politics and liberty. Yet, by the mid-1960s, serious theoretical argument had given way to an ostensible consensus, dubbed “fusionism.” This ideological position, whose leading exponent was Frank Meyer himself, has been summarized as holding that “virtue is the ultimate end of man as man,” but that individual freedom is the “ultimate political end.” Indeed, according to Meyer’s relatively mature, “fusionist” position, the “achievement of virtue” was none of the state’s business, hence not a political question at all.
Despite its label, Meyer’s “fusionism” never achieved a genuine philosophical synthesis of Burkean conservatism and the ideology of classical liberalism or libertarianism. A genuine synthesis would have been impossible, for the two opposing positions are based on contradictory assumptions. For traditional conservatives, the notion that freedom can exist in the absence of moral restraint flies in the face of all historical experience.
Adam Smith, who is widely regarded as the father of economics, noted in The Theory of Moral Sentiments, for example, that “upon the tolerable observance” of such duties as politeness, justice, trust, chastity, and fidelity “depends the very existence of human society, which would crumble into nothing if mankind were not generally impressed with a reverence for these important rules of conduct.” Smith added that social order is not spontaneous or automatic, but is founded on institutions that promote self-control, prudence, gratification deferral, respect for the lives and property of others, and some concern for the common good.
Burke, who was an admirer of Smith, similarly wrote: “Men are qualified for civil liberty in exact proportion to their disposition to put moral chains upon their own appetites; in proportion as their love of justice is above their rapacity . . . . Society cannot exist unless a controlling power upon will and appetite be placed somewhere, and the less of it there is within, the more there must be without.” Hence, for traditional conservatism as represented by Burke, by Smith in important respects, and by the American constitutional framers, the advancement of political liberty in any meaningful sense necessarily entails the simultaneous advancement of an ethic of individual restraint and responsibility in support of the common good. Success in the first is impossible without success in the second. To suggest otherwise, according to traditional conservatism, would be absurd.
Yet Meyer’s fusionism does precisely that. He elevates the pursuit of liberty to the highest goal of politics while ignoring freedom’s dependence on moral restraint and its corresponding institutional and cultural supports. True enough, in his overtures for the traditionalists’ support, Meyer pays homage to man’s higher ends, even to religion, yet it is clear from his writings that he remains at a loss concerning what those ends entail. As late as 1962 he was still asserting, for example, the reality of the “rational, volitional, autonomous individual” versus the “myth of society.”
Remove the effects of society on human life for but an hour, a Burke or a Smith would respond to Meyer, and he would recognize soon enough the part of reality he had missed.
A telling measure of morality’s lack of significance in Meyer’s fusionism is that it paralleled the place accorded to religion by many avid secularists: religion is all right as a private matter, but it has no legitimate place in public life. According to Meyer, the constitutional framers shared his preference for separating morality and politics, but this would have come as startling news to George Washington, among others, who said in his Farewell Address: “Of all the dispositions and habits which lead to political prosperity, religion and morality are indispensable supports. . . . [R]eason and experience both forbid us to expect that national morality can prevail in exclusion of religious principle.”
In the end, all that separated Meyer’s fusionist position from libertarianism was the superimposition of a few traditionalist-sounding rhetorical flourishes. In respect to their practical import for how Americans participate in private and public life, the two positions were identical. Such was the considered opinion of the late libertarian scholar and activist Murray N. Rothbard, as expressed in the Fall 1981 issue of Modern Age. Yet, beginning in the mid-1960s, large numbers of Americans who would have been reluctant to embrace libertarianism that was labeled as such found themselves able to do so when it was newly packaged, with the assistance of Meyer and his fusionist allies, as “conservatism.”
As George Nash observed in his 1976 history of American intellectual conservatism, “rather surprisingly, by the mid-1960s the tumult began to subside. Perhaps, as Meyer remarked, the disputants had run out of fresh things to say. Certainly, they had other topics on their mind—the rise of Senator Goldwater, for instance. And, as the dust settled, many conservatives made a common discovery: that Meyer’s fusionism had won. Quietly, with little fanfare, by a process [Meyer] later called, ‘osmosis,’ fusionism became, for most National Review conservatives, a fait accompli.”
What Nash here reports as a victory for fusionism may have been such in practice but certainly not in theory. A major and festering moral and philosophical problem had been swept under the rug. This could happen because those most directly involved had much less interest in philosophical stringency than in issues of practical politics.
Ironically, in the same 1981 issue of Modern Age in which the libertarian Rothbard explained that Meyer’s fusionism was actually libertarianism, Russell Kirk posed the question of what conservatism (of the traditionalist or pre-fusionist variety) and libertarianism have in common. His answer was that, except for sharing “a detestation of collectivism”—an opposition to “the totalist state and the heavy hand of bureaucracy”—conservatives and libertarians have “nothing” in common. “Nor will they ever have,” he added. “To talk of forming a league or coalition between these two is like advocating a union of fire and ice.”
Leveling against libertarianism criticism that could have applied equally to Meyer’s fusionism, Kirk wrote: “The ruinous failing of the ideologues who call themselves libertarians is their fanatic attachment to a simple solitary principle—that is, to the notion of personal freedom as the whole end of the civil social order, and indeed of human existence.” The libertarians, Kirk reported, borrowed whole from John Stuart Mill’s 1859 book On Liberty the principle that “the sole end for which mankind are warranted, individually or collectively, in interfering with the liberty of action of any of their number, is self-protection.”
As noted previously, fusionism, too, made Mill’s principle sacrosanct, denying any legitimate place in politics for promoting moral restraint. The ability of every individual to act without regard for the common good was elevated to the highest end of conservative politics. All of conservatism’s subsidiary political goals—limited government, free enterprise, private property, minimal taxation—became similarly associated with the unrestrained pursuit of self-interest.
If society is considered less than real, the highest goal for which the individual can strive is to be able to do as he or she pleases to the greatest extent possible. And since doing as he or she pleases is synonymous with freedom by the fusionists’ definition, it follows that, for them in their heart of hearts, there never can be too much liberty or (which is to say the same thing) too little government. To view the world in the light of such broad generalizations discourages subtlety of mind and attention to the needs of actual historical situations. “If you believe in the capitalist system,” Rush Limbaugh explained in a September 2009 television interview, “then you have to erase from your whole worldview what does somebody need. It’s not about need. . . . it is about doing whatever you want to do.”
In contrast with the one-sided emphasis on freedom characteristic of movement conservatism since the 1960s, traditional conservatism views both government and limits on government as necessary responses to man’s flawed moral nature. Because men are not angels, as Madison observed, government is needed to help restrain their passions. But since governments are made of fallible men and not angels, governments also must be limited: “In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed; and in the next place oblige it to control itself.”
Similarly, Burke instructed: “To make a government requires no great prudence. Settle the seat of power; teach obedience: and the work is done. To give freedom is still more easy. It is not necessary to guide; and only requires to let go the rein. But to form a free government; that is, to temper together these opposite elements of liberty and restraints in one consistent work, requires much thought, deep reflection, a sagacious, powerful, and combining mind.”
Unfortunately, what America has lacked during much of its history and increasingly so is “free government” such as advocated by the framers, Burke, Babbitt, Kirk, and other traditional conservatives. Instead, the tendency has been for political power and the control of government to lurch back and forth between Big Government “progressives” who are prone always and everywhere to “teach obedience” and Small Government “conservatives” (or libertarians) who are prone always and everywhere to “let go the rein.”
Because they are guided by abstract generalizations rather than historical reality, ideologues of both types are blind to the changing proportions of liberty and restraint appropriate to actual circumstances. The assumption of power by either group, therefore, inevitably heralds trouble. The response of the electorate almost invariably has been to displace one set of rascals with its opposite number, only to have the process repeat itself ere long.
What about the most recent election? Does the latest shift in favor of “conservative principles” signal a departure from the long-established dysfunctional pattern? To reiterate what was stated tentatively above: The answer depends on what is meant by conservative principles. Almost certainly more dysfunction is on the way. Is there a way to get out of this cycle? One necessary step is to face complex reality and to break the morally and philosophically lazy habits that stand in the way of understanding the prerequisites of liberty.
Some who think of themselves as libertarians may object to the argument here offered that they do recognize that liberty needs moral, cultural, and institutional supports and that liberty is not an end in itself. Such libertarians may be closer to the traditional conservatives than they realize. Their “libertarianism” does in fact suggest the kind of philosophically tenable rapprochement between liberals and conservatives that Meyer’s “fusionism” clearly failed to achieve.

Joseph Baldacchino is president of the National Humanities Institute and editor of Humanitas.
This article was originally published by the National Humanities Institute in Epistulae No. 11, December 2, 2010.