Archive for the ‘Education’ Category

What Questions Are We Asking?

While discussing the place of Christians in contemporary academic and intellectual life, Charles Taylor writes in his concluding reflections to A Catholic Modernity? about how an atmosphere of unbelief has shaped not just the answers given in those places, but even the questions that are asked:

Add to this that beginning students are rarely too clear about what remarks they want to make anyway; we have more in the nature of confused intuitions at that stage (indeed, we have a lot of those at this stage, too), and we can easily understand how a student slides into a pattern of conformity, which may then become a lifelong habit.

A striking example of this preshaped agenda is the aspect of moral theory which I talked about in Sources [of the Self] and again in my lecture here. I argued in the lecture that a key issue for our times is that of moral sources, whether, for instance, we can maintain the high level of philanthropy and solidarity we now demand of ourselves, without these degenerating into their opposites: contempt, the need to control. The issue here is the quality of our moral motivation–in more old-fashioned terms, the quality of our will and the nature of the vision that sustains it.

Plato or Aristotle would have understood what I was talking about, although, of course, not the Christian or modern reference points of my discussion. But modern moral philosophy, particularly in the analytic world, has undergone a drastic foreshortening. These issues just fall off the agenda. For those thinking in the wake of the utilitarians and Kant, for instance, the principal moral question is, What ought we to do? (as against What is good to be? or What should we love?), and the principal task of moral philosophy is to find the principle or principles from which we can derive what we ought to do (the greatest happiness, or universalization, or whatever).

I was struck in some of the comments on Sources by how many people couldn’t seem to grasp what question I was addressing. They took “moral sources” to be another name for the highest principles. They literally couldn’t think outside the contemporary agenda.

But, one wants to protest, don’t you see that it also matters whether people can actually bring themselves to do the right thing? But then your interlocutor looks at you blankly and says: of course, but that’s not moral philosophy; how people actually get motivated, that’s in the domain of psychology, or sociology, or whatever.

In other words, these two issues, what we should do and how we come to do it, which were unproblematically seen as part of the same inquiry by Plato, Augustine, and just about everybody else until the last three centuries, have been neatly sundered and placed in noncommunicating intellectual universes. (119-120)

In Praise of Great Men

Our “post-modern” age (if there is such a thing) is sometimes skeptical, and even cynical, about “great men”. And frequently, that cynicism is not entirely unjustified. Great men, in the sense of historically significant men, have often not been good men. That being acknowledged (and it cannot be ignored), I want to make two brief points in reply.

Firstly, while I would not wish the “new” kinds of history to disappear, I don’t think history can be done without significant study of “great men”. This is because history is the search for the causes of contingent events, and when it comes to human history, those causes can and do include the actions of individuals. And “great men” are called that precisely because they produced widespread effects. For this reason alone, a complete understanding of history cannot overlook them.

Secondly, I think there’s something to be said for the value of great men in moral education. Certainly, educating people to be virtuous, to be good, is more important than making them “great”. To some degree, the extent of our effects on the world is a matter beyond our control; our character, however, is something much more directly under our control. So it makes sense to focus more intensely on changing the things we can change. Nevertheless, I don’t see what’s intrinsically wrong with having widespread significance, and so I don’t think it can be intrinsically wrong to desire such a significance. Pride in one’s greatness would of course be wrong, and greatness provides an occasion for pride that is not present otherwise. Yet, excellence in anything, including character, provides such an occasion. So the hazard does not ultimately make such a goal illicit.

Further, while I think utilitarianism is wrong to say that the only standard of right is “what maximizes the good”, still, granting that some things are intrinsically wrong, and all other things being equal, “maximizing the good” is a noble goal. And I’m not sure how one can have such an aspiration without implicitly desiring to be “great”, at least in the sense of “causing widespread effects”.

Finally, it is true, most people will never be “great”. For this reason, it is unreasonable for an individual to assume they will be great, and it is even more unreasonable to place that expectation on others. Because our “greatness” in a historical sense is contingent upon so many things beyond our control (above all, the will of God), we can never take such a verdict on our life for granted. Yet, with that important qualification in place, I don’t see a reason to absolutely prohibit a desire to have widespread beneficial effects.

So I still think there’s a case to be made for studying “great men”, and studying to be among their company.

Should We Make Curriculum More Relevant?

From Roger Ebert’s glowing review of the timeless Mr. Holland’s Opus:

Watching this film, falling into its rhythm, appreciating its sweep, I could not help but remember my own high school teachers. Sitting here at the keyboard, I began a list of their names, only to realize that you have your own list. Amazing, how clearly I remember their excellence, and their patience. One anecdote will do. Stanley Hynes, who taught us Shakespeare, always addressed us as “Mr.” and “Miss,” as a college teacher would do, and somehow that brought a greater seriousness to “Macbeth” and “Julius Caesar,” which were uncharted new worlds for us. Modifying the curriculum to make it more “contemporary” and “relevant” is doing an injustice to students, whose lives will become relevant to the exact degree that high school encourages them to outgrow themselves, and escape from the contemporary into the timeless. Mr. Hynes knew that. So does “Mr. Holland’s Opus.”

Private Religious Education Contributes More To The Common Good

It turns out that private religious education doesn’t pump out “ultra-rich snobs and religious hardliners”, at least according to a recent education survey by Cardus. More than 2,000 former students between the ages of 24 and 39 who attended separate Catholic, Evangelical, or private Christian schools were queried on issues such as charitable donations and civic involvement.

The study concludes alternative school graduates are as likely, if not more so, to be valuable contributors to the “public good.”

Students at independent schools make up about 8% of Canada’s school-age population, a figure that includes government-funded Catholic schools in Ontario.

Lead researcher Ray Pennings says the findings help shatter the stereotype that alternative schools have a negative impact on Canada’s multicultural fabric.

“The perception has been that independent schools were for rich kids and religious kooks who were focused on themselves,” said Pennings. “What the study actually shows is that whether it’s social engagement, donating or volunteering, the graduates of these schools are achieving the objectives of public education at equal or greater proportions than the public school systems.”

Graduates of non-government schools took part in more neighbourhood and community groups, but more significantly, they were more likely to vote and participate in grassroots movements. (HT: James KA Smith)

Solid Advice On Blogging

We’ve already done this before on the blog by way of showing what Andrew Sullivan thinks on the topic. Given my own pathetic contributions to the blog as of late, I thought I’d share some solid blogging advice I just came across. This comes from Tom Bennett, a British education author I stumbled across while coveting books in the infamous Blackwell’s in Oxford last summer. As I’m a teacher, I’m trying to get my hands on everything that Bennett writes, so I was delighted to see that he has a blog.

Here are Bennett’s gems:

1. Write because you have to.

Write because you itch to write; because there’s something to be said that hasn’t been said before and you need to be the one who says it. Write because you own some part of the truth that nobody else possesses. Write because you’re lying awake at night and a particular arrangement of words won’t leave you alone, nudging you to trap them on page or screen. Write because you can’t sleep for not doing so. Write because a day without writing makes you feel empty and indolent.

This is important. Don’t write for money, at least not at first. If money is your primary concern, then get a paper round. Don’t write because you want to be famous: drive your car through a shopping centre if you want that. Write because you are a writer, and that is what you do.

2. Say exactly what you want to say.

Say what you want to say in the most direct way you can. Don’t worry, as you write, is this good enough? Worry about that later. Instead, write about the thing that makes you angry, sad, upset, agitated or elated. Write your truth, in your voice, but don’t worry about what your voice is; just speak. Say it in exactly the way you want, and don’t worry about offence. That comes later. But if you self-edit at this point, you dam the river of words that bubbles and boils inside you.

3. Edit your work.

You should write as if no one is reading it, then edit as if everyone is. Fix grammar and spelling first. Try reading it aloud to yourself. How does it sound? By the end of the piece you usually have a better idea of how it should start, so unpick, unstitch, and most of all, hack away. Gut anything that doesn’t add to the meaning. You love a phrase but it doesn’t serve the whole piece, or it detracts or distracts? Get rid of it. Save it for later. If it’s good you’ll find somewhere to plant it.

If a paragraph doesn’t help the whole thing, cut it out. If you find one half goes in a different direction, cut it in half, like Solomon, and decide which half you love most. Finally, edit for libel, and ask yourself if you have inadvertently caused offence. Nothing wrong with offence at times, but make sure it’s advertent. And legal.

4. Blog regularly.

I feed robins in my garden. At first they must have been delighted by my RSPB coconuts. Eventually they started coming back looking for more. Now, they’re fat. Good.

5. Interact and promote.

You want a readership? Of course you do, otherwise you wouldn’t be blogging: keep a diary instead if it’s just the love of writing, because the love of being read is another requisite of the whole thing. Respond to comments. Use networking sites like Twitter to spread the word. Leave links as parts of discussions on other blogs and education websites.

6. Be prepared for the bouquets and the brickbats.

And finally, the most important thing is to be prepared for a reaction. Some will laud you, and some will damn you. As Malcolm Tucker says in The Thick of It, ‘Are you prepared to be a dartboard?’ The answer has to be yes. But that doesn’t matter, because some of the darts will be flowers, and because blogging is something you love to, have to, do. And if people still troll, even when you’ve genuinely attempted to engage with them, then, as Christopher Hitchens said, ‘They can take a ticket and get in line to kiss my ass.’

But first you have to write the damned things. Stop thinking about it. Just write.

So, hopefully this will inspire me to write more often than once every four months.

Religion Scholarship And Ivy League Admissions

Several days ago I spent some time reading through an extensive article by Ron Unz at The American Conservative on admissions practices in Ivy League schools in the USA, entitled “The Myth of American Meritocracy”. While I certainly cannot do justice to all the arguments and stories he marshals to make his points, I want to draw out an implication from one statement he makes. He writes:

In recent decades, elite college admissions policy has frequently become an ideological battlefield between liberals and conservatives, but I would argue that both these warring camps have been missing the actual reality of the situation.

Conservatives have denounced “affirmative action” policies which emphasize race over academic merit, and thereby lead to the enrollment of lesser qualified blacks and Hispanics over their more qualified white and Asian competitors; they argue that our elite institutions should be color-blind and race-neutral. Meanwhile, liberals have countered that the student body of these institutions should “look like America,” at least approximately, and that ethnic and racial diversity intrinsically provide important educational benefits, at least if all admitted students are reasonably qualified and able to do the work.

My own position has always been strongly in the former camp, supporting meritocracy over diversity in elite admissions. But based on the detailed evidence I have discussed above, it appears that both these ideological values have gradually been overwhelmed and replaced by the influence of corruption and ethnic favoritism, thereby selecting future American elites which are not meritocratic nor diverse, neither being drawn from our most able students nor reasonably reflecting the general American population.

The overwhelming evidence is that the system currently employed by most of our leading universities admits applicants whose ability may be unremarkable but who are beneficiaries of underhanded manipulation and favoritism. Nations which put their future national leadership in the hands of such individuals are likely to encounter enormous economic and social problems, exactly the sort of problems which our own country seems to have increasingly experienced over the last couple of decades. And unless the absurdly skewed enrollments of our elite academic institutions are corrected, the composition of these feeder institutions will ensure that such national problems only continue to grow worse as time passes. We should therefore consider various means of correcting the severe flaws in our academic admissions system, which functions as the primary intake valve of our future national elites.

Unz draws attention to the deleterious effects the current process will have on the nation’s economy and society. But students of religion should note that these same schools are producing the “best minds” working on religion. Yet, if we recognize the large influence of corruption, both financial and ideological, we must, if we are reasonable, adjust our respect for the consensuses of these institutions downward accordingly.

I think one “take-away” from this article on the state of humanities and religion scholarship is this: we ought to simply recognize brilliance and foolishness wherever they appear, rather than give an argument more weight because it came from a prestigious institution, or less because it did not.

Fitness And Mental Excellence

I’ve been chipping away, infrequently, at John J. Ratey’s book, Spark: The Revolutionary New Science of Exercise and the Brain. What struck me immediately about this book is that, while it majors on the science, it recognizes that its practical advice is perennial. The epigraph that begins the book:

In order for man to succeed in life, God provided him with two means, education and physical activity. Not separately, one for the soul and the other for the body, but for the two together. With these two means, man can attain perfection. – Plato

He begins his book with the case of Naperville Central High School. This school begins its day with a fitness regimen for students, focused not on “sports” but, as I said, on fitness. This is accomplished through simple running, with the goal of raising students’ heart rates, measured with monitors worn by each pupil. The science Ratey goes into in his book suggests that physical exercise has many benefits for the brain, and for mental processes in particular. But the most memorable part of the first chapter for me was the following:

Those exams aren’t nearly as telling as the Trends in International Mathematics and Science Study (TIMSS), a test designed to compare students’ knowledge levels from different countries in two key subject areas. This is the exam cited by New York Times editorialist Thomas Friedman, author of The World is Flat, when he laments that students in places like Singapore are “eating our lunch.” The education gap between the United States and Asia is widening, Friedman points out. Whereas in some Asian countries nearly half of the students score in the top tier, only 7 percent of U.S. students hit that mark.

TIMSS has been administered every four years since 1995. The 1999 edition included 230,000 students from thirty-eight countries, 59,000 of whom are from the United States. While New Trier and eighteen other schools along Chicago’s wealthy North Shore formed a consortium to take the TIMSS (thereby masking individual schools’ performance), Naperville 203 signed up on its own to get an international benchmark of its students’ performance. Some 97 percent of its eighth graders took the test–not merely the best and the brightest. How did they stack up? On the science section of the TIMSS, Naperville’s students finished first, just ahead of Singapore, and then the North Shore consortium. Number one in the world. On the math section, Naperville scored sixth, behind only Singapore, Korea, Taiwan, Hong Kong, and Japan. [13-14]

Ratey does acknowledge that Naperville resides in a demographically advantaged school district, in terms of race and income. Yet he argues that the coincidence of unusual phys-ed and exceptional science scores is too interesting to dismiss out of hand, especially since Naperville is far from the only “wealthy suburb in the country with intelligent, educated parents. And in poor districts where Naperville-style PE has taken root, such as Titusville, Pennsylvania…, test scores have improved measurably.” (15)

Some Political Musings

I’ve been reflecting a bit on the significance of the US presidential election, especially the voting habits of the “millennial” demographic. The following are some unorganized and totally biased suggestions I have for how churches could be discipling their members in political matters.

1. Natural law. Christian conservatives sometimes speak and act as if the order of grace totally replaced that of nature. In practice, this principle yields a wholly divine-positive-law approach to policy: a Bible verse must be found for everything, and that becomes the whole (or basically the whole) of our argument. I think a recovery of natural law thinking can be a corrective here. It can help both believers and unbelievers to see that God’s commands are not arbitrary impositions of a cosmic despot who wants to spoil our fun; they are instead simply the will of God inscribed in our very nature. God tells us to act the way he made us to act, and when we act the way he designed us to, we flourish, not wither. This suggestion will imply recovering a great deal of Christian jurisprudence, antique, medieval, and modern, Roman Catholic, Eastern Orthodox, and Protestant. It probably should include knowledge of the jurisprudence of other cultures, too (Arabic, Indian, Chinese, Jewish, etc.). Of course, not every individual would need to study all these things, but seminary professors probably should have some knowledge of it all, and it should filter down in vocationally appropriate ways through the ministers to the laity.

2. More specifically, in recovering natural law teaching, the conservative Protestant church should be open to rediscovering a different part of the political “spectrum” than the one it currently inhabits. The tradition of the church really is socially “conservative” in most ways, but not economically libertarian. The American libertarian tradition is really the odd one out on this, and it is important to at least recognize this, even if a person or institution ultimately wants to side with it. I think, though, conservative Christians might be able to have a real effect on the political world, in the long haul, if they were to present an actual third way: a politics that strives for the common good in all areas of life, not just sex and death issues.

3. Also more particularly, the church desperately needs to get back in touch with the just war tradition, and to a lesser degree, even its pacifistic heritage. It needs to understand why it is not pacifist, if it is not going to be pacifist, and it needs to understand how its approach to war is grounded in something other than sheer brute desire for conquest and self-satisfaction. The just war tradition, ultimately, is guided by (duh) justice, which means “to each as he deserves”. It means a preferential option (not indefeasible) for the preservation of life, a respect for order, and proportionate and discriminate use of force (not terrorism and blood lust).

4. This may go without saying, but I’d like to say it anyway: part of discipleship in our age needs to be the inculcating of a critical distance from partisan politics. Though we may vote a certain way in elections, we ought not to be doing this out of loyalty to a brand. It ought to be the result of a careful moral calculus. And Christians should not have a problem criticizing even their own preferred candidate or policy if it is defective from the point of view of ideal justice. I am firmly convinced that a vigilant people is the only thing that will keep a society free and just, and the church cannot contribute to such a society if it is basically in the bag for one power-seeking party or another.

FWIW.

Plato – Religion in the Classroom

My buddy Michael Plato (no relation to a certain expositor of Socrates) teaches Popular Culture, Film, and English at Seneca College in Toronto. He and I are both members at New City Baptist in the city’s downtown core. A week or so ago Mike did a three-part seminar on Mormonism that was fantastic. Some years ago he did a documentary for PBS on Mormonism, and even had the opportunity to interview Harold Bloom for it. Sadly, as with so many documentaries, it didn’t air. Mike’s been a huge resource on English lit for me; you can read about his literary habits in this interview here.

Michael recently wrote a piece for Seneca’s The SELS Review Vol. 2 (Sept. 2012) called “Religion in the Classroom.” In it he reflects on ways that professors can understand and respect the religious perspectives of their students. I asked Mike if I could post it here, even though it won’t be published until September. He was amenable, so here it is (I omitted the bibliography, so you’ll have to chase up the references yourself):

Not so long ago it was understood that if a family was going to survive Thanksgiving, three subjects were never to be raised at the dinner table: money, politics and religion. Extended families being fragile entities at the best of times, this maxim still makes good sense. The classroom, on the other hand, is the place where discussion should be allowed to roam freely and widely, the more challenging, the better. That at least is how we like to perceive it in our most idealistic moments. Yet while debate on economics and politics is often encouraged in many classrooms, when religious views are raised, an unease or discomfort is more often the result. I do not think I am the only one who has felt that awkward silence when a student suddenly interjects an explicitly religious view into a class conversation. As religious historian George Marsden notes, even among those with strong religious convictions, “separation of faith and learning [are] widely taken for granted in our culture” (Marsden, 1997, p.5). As such, we are unsure of how to proceed when they collide.

To be sure, some of this apprehension may be the result of a paranoia stemming from political correctness – the fear that we may unintentionally offend someone of a different cultural background. More likely, it is founded on the growing assumption that religious views have no place in the public square, and by extension, the schools. This view is most succinctly articulated by Richard Rorty, who described religion as a “conversation stopper”. For Rorty, the exclusion of religion from public discourse is a pragmatic necessity: it is divisive and time-consuming to constantly argue over religion, while secular reasoning, as he sees it, is universal and available to all (Rorty, 1999, pp.166-167).

Stephen L. Carter of Yale responds, however, that it is impossible to leave our religious views behind when we are dealing with any issues of belief or moral reasoning.

Efforts to craft a public square from which religious conversation is absent, no matter how thoughtfully worked out, will always in the end say to those of organized religion that they alone, unlike everybody else, must enter public dialogue only after leaving behind that part of themselves that they may consider the most vital (Carter, 1999, p.90).

To that, author Timothy Keller adds that many secular assumptions are just as “faith based” as many overtly religious ones: “secular concepts such as ‘self-realization’ and ‘autonomy’ are impossible to prove and are ‘conversation stoppers’ just as much as appeals to the Bible.” In other words, as Keller sees it, “statements that seem to be common sense to the speakers are nonetheless often profoundly religious in nature” (Keller, 2008, p. 16).

So where does this leave us in terms of discussing religion in the classroom? Should we be encouraging it? Having little space here to develop strategies for effectively “managing” religious views in class dialogue, I hope at this point simply to address three common assumptions, or blind spots. Simple awareness of these three aspects of the nature of religious beliefs will not resolve all crises or concerns that may arise, but it will give the classroom facilitator a greater sensitivity to the views of students with strong religious convictions, and may help to make the classroom a more amenable – and less awkward – place for religious, or more generally, “belief based” discussions. They are as follows: 1. The non-syncretistic nature of most religions. 2. The superior claims of religion. 3. The wider worldview implications of religious beliefs.

A common assumption that has been growing in the west, at least since the Second World War, is the syncretistic view of religion. That is, all religions are basically about achieving the same goals or ends. A commonly used metaphor for this view is the one which states that the world’s religions “are different paths up the same mountain.” Though this view was promulgated by authors such as Huston Smith and Joseph Campbell forty and fifty years ago, many contemporary religious scholars now consider it to be well meaning, but incorrect and often inappropriate. As Stephen Prothero puts it:

No one argues that different economic systems or political regimes are one and the same. Capitalism and socialism are so obviously at odds that their differences hardly bear mentioning. The same goes for democracy and monarchy. Yet scholars continue to claim that religious rivals such as Hinduism and Islam, Judaism and Christianity are, by some miracle of the imagination, essentially the same, and this view resounds in the echo chamber of popular culture (Prothero, 2010, p.1).

One example of religious syncretism that is commonly found in modern western society is what Harold Bloom calls the “myth” of the “Judeo-Christian Tradition”. Though both faiths, Judaism and Christianity, share some historical origins, texts and concepts, the foundational formulations of each are radically “antithetical to each other” (Bloom, 2005, p.234). For believers of these faiths, syncretistic assumptions of them being essentially “the same” can be perceived not as insightful or unifying, but rather as patronizing. The same could be said for the common habit of lumping Buddhism, Hinduism, Confucianism, Shintoism, etc., into the category of “eastern religions”.

For many who do not profess faith, religion is never more threatening than when it is being proselytized. Practiced certainly, studied of course, but any attempt to convert someone else to it is seen as both arrogant and wrong. It smacks of superiority, even ethnocentrism. Yet, if we are completely honest, this view itself is likewise just as ethnocentric as any other religious or cultural claim. People from the majority of non-western cultures have no problem saying that their religion and culture is best (Keller, p.12). For us to say that such a stance is wrong is in fact our own form of cultural superiority. We think that those who consider themselves superior are in reality inferior, in the end securing our own sense of superiority.

Mark Lilla, a professor of government at Columbia University, demonstrates this irony in recounting what he realized was happening when he tried to talk one of his students out of becoming a Christian:

I wanted to cast doubt on the step he was about to take, to help him see there are other ways to live, other ways to seek knowledge, love…. even self-transformation. I wanted to convince him his dignity depended on maintaining a free, sceptical attitude towards doctrine. I wanted… to save him…(Lilla, 2005, p.95).

Lilla’s self-knowledge reveals that his doubts about Christianity were in fact a learned alternative faith. We are all ultimately exclusive in our beliefs about religion, but in different ways. As such, when we hear proselytizing in the class, we should be aware of how our response may be its own form of proselytizing.

A final point to consider is this: what are the larger worldview implications of religion? Religions are not simply sets of ritualistic practices or collections of narratives which engender personal fulfilment; they are in fact sets of presuppositions about the nature of reality, which provide a foundation on which the believer lives, moves and has being. James W. Sire points out that worldview answers such fundamental questions as: What is external reality? What is a human? What happens at death? How do we know right and wrong? And what is the meaning of history? (Sire, 2004, p.20)

Understanding religion as encompassing these widely diverse dimensions may help to explain why certain religious traditions respond in the way they do to certain aspects of culture and teaching. A prime example of this would be the antipathy many Christians feel towards the theory of evolution. What offends these Christians about the theory is not the complexity of the science, or that it does not correspond to the description of creation in the Bible, but rather the way certain secular thinkers use evolutionary theory to suggestively undermine supernatural claims which are beyond their scientific purview (Cunningham, 2010, pp. xvi-xvii).[1] In order to comprehend some religious responses to specific, seemingly non-religious subjects, such as the sciences or politics, a deeper understanding of one’s own worldview presuppositions and how they interact with or challenge those of a religious worldview may be necessary.

I do not intend for these cursory observations to be any kind of final statement determining practices within classrooms, much less act as the basis of college policy. Rather, I hope merely to draw attention to the complexities of an aspect of our rich diversity which has only begun to be addressed. For much of the 20th Century, a working assumption within schools and western society at large was that religion was a dying phenomenon (Keller, p.3). In 1966, Time Magazine asked on its cover “Is God Dead?” The affirmative answer given then was widely accepted. Religion continued its gradual decline over the next few decades. Then, in the few years following the millennium the trend suddenly appeared to be reversing itself. Surveys began to show that religion was again on the rise (Jenkins, 2011, p.6). The world was beginning to respond to Time Magazine’s query that no, God is indeed not dead.[2] Whether we welcome it or shun it, a decade in, the 21st Century is turning out to be a century for faith. As educators, the onus is upon us to know exactly what that means… and to talk about it.


[1] An excellent examination of American Evangelical mistrust of evolution and its worldview implications is the essay by Marilynne Robinson, “Darwinism” in The Death of Adam: Essays on Modern Thought, Picador, New York, 2005. pp. 28-75

[2] A number of works have appeared in the past few years pointing to the remarkable resurgence of religion globally. Representative of this would be God is Back: How the Global Revival of Faith is Changing the World by John Micklethwait and Adrian Wooldridge, The Penguin Press, New York, 2009.

Some Thoughts On Conservatism

So, since being converted back to a kind of “realist conservatism” by Edward Feser (from libertarianism), I have been pondering what this system implies for more particular political policy judgments. I see clearly what it means for matters like marriage and for life-and-death issues like abortion and euthanasia, but what really intrigues me are the economic and financial questions. Feser argues (see article linked here) that natural law may support a right to a kind of state education system and a state health care system in certain situations, specifically situations where the people who would receive the assistance are in real distress (i.e., not the whole society generally):

A further possible justification of a right to assistance when in distress vis-à-vis health care and education would be to hold that such assistance falls under the “public good” that the state is obliged to provide for under natural law. The operative principle here is that of subsidiarity, according to which the more central authorities within a society should not carry out any functions that can be performed by the less central ones, though the more central authorities should carry out those that cannot be performed by the less central ones. To the extent that those in distress vis à vis health care and education simply have no other recourse, a right to assistance would arguably follow, if not from the Natural Law Proviso by itself, then at least from that proviso together with the classical natural law theory’s conception of the state and its proper functions.

The extent of governmental assistance such a right would justify is another question, and here I will end with three points. First, what classical natural law theory strictly requires and strictly rules out in the way of practical policy is much less than many partisans of various political persuasions would like. What it strictly requires is a system of private property rights that are robust but not absolute. What it strictly rules out, accordingly, are socialism at one extreme and laissez-faire libertarianism at the other. Between these extremes, though, there is wide latitude for reasonable disagreement among classical natural law theorists about how best to apply their principles, and these disagreements can largely be settled only by appeal to prudential matters of economics, sociology, and practical politics rather than fundamental moral principle. (50-51)

It strikes me, too, that “conservatism” itself is not a term that says much about what particular decisions one should make on those issues. If it just means “what governments traditionally do”, well then libertarianism is certainly not the answer. In the 18th century, as far as I understand, the British government indeed had welfare systems and practiced economic protectionism of a kind. So would a “conservative” be labelled a Keynesian socialist by today’s standard “conservative”? This, I think, gets to the problem with the label in general: “conservatism”, as a label, seems only to mean “wants to change slowly”, but does not determine in any clear way what change exactly is desired, or what end-goals are permitted.

If one wants to go back further (perhaps this is the conservative impulse itself?), to the dawn of the modern era, then most political systems would affirm a natural law, and draw from it several policies which would probably include a mixture of a welfare state (in a very broad sense) and a desire to preserve some private property. Is this what the “conservative” policy really is?

I know changing labels is not an easy thing to do, but as someone who feels forced to identify with “conservatism” in a broad sense in our culture (for lack of a better alternative), I really wish there were a more descriptive word that identified what “conservatism” really is for.