Saturday, April 23, 2016

When did Christianity start to lose its hold on American Culture?

Taking the long view, I would say around 1770.

Yes, 1770, give or take.

While many evangelicals see the revolutionary period as a golden age in American history -- and in many ways it was -- Christians during that era frequently lamented that religious commitment had waned. While such jeremiads have been common throughout American history, there is reason to think that in this instance they were correct. Clearly, religious fervor had declined since the end of the Great Awakening around 1750. The clergy claimed that the realities of the Revolutionary War distracted the population from religious exercise, and in the aftermath of the war Christians grew concerned about French skepticism making inroads into the culture.

The Second Great Awakening, which began around 1800, provided a religious response to these trends, but it came at a cost. As American religion became democratized, traditional forms of religious authority began to be marginalized. In some ways that was a positive for American Christianity: the growth of a menu of democratic religious options meant that most people could find something to their liking, unlike in Europe, where more limited choices led many to drop out of religious life altogether. Nonetheless, Christianity during this period became more emotional and anti-intellectual, which would become problematic as the nation grew more diverse. In addition, American Christians became more concerned with religious fervor than with coherent commitment to historic orthodoxy.

Even so, the Protestant mainstream maintained a quasi-establishment role in the country through the 19th century; once again, however, it did so at a cost. The population was becoming much more diverse as a result of democratizing influences, frontier expansion, immigration, and other trends, and that diversity produced greater religious pluralism. Thus, in order to maintain the Protestant establishment, American public religion came increasingly to emphasize moral over doctrinal concerns. There is an irony in the fact that many modern evangelicals cite early American affirmations of the need for religious morality for the health of the Republic without realizing that such moralizing actually represented a decline in the specifically Christian content proclaimed by the churches.

The standing of the Protestant establishment became further strained in the latter half of the 19th century, as industrial and academic trends pushed Christian concerns and activism toward the margins. Urbanization proved challenging to the essentially agrarian outlook of much of American Christianity. As American industry began to look to colleges to produce workers capable of meeting the needs of the new economy, the presence of clergy on college boards seemed less advisable, and they began to be replaced by business leaders. In addition, naturalistic philosophy, along with Darwinian science, took American academia by storm. Even in divinity schools, naturalism began to have influence, with the result that many schools began producing ministers who no longer believed in Christianity. One result was the withdrawal of many religious conservatives from mainstream culture in the movement known as fundamentalism.

By the 1950s, as historian George Marsden has noted, American thought leaders shared a belief in the goals of America's founders, but they no longer believed in the basis for those goals, which rested largely on the conviction that inalienable rights come from a creator. Efforts to bridge that gap failed, and America's civil debates deteriorated into an era of identity politics that continues to the present. Thus, what emerged in the 1960s had obvious roots in the culture of the 1950s. In that atmosphere of identity politics, beginning in 1980 the Moral Majority (note the emphasis on morality without any theological commitment, and on majoritarianism rather than shared conviction) provided foot soldiers for conservative victories for a time, though anyone looking at the demographics of those holding varying views should have recognized that the moral relativists had, in fact, already won. As older generations died out, they would be replaced by people without the same moral, much less theological, commitments.

Thus, American culture had changed long before the recent Supreme Court rulings on same-sex marriage made the country less hospitable toward evangelical views. Many have responded with a sense of betrayal -- in spite of past Court actions (particularly Roe v. Wade), many still considered this to be a "Christian country." It would benefit them to think more carefully about both their theology and their history.

Of course, Christianity remains broadly influential in the United States, though it may never again hold the level of respectability and influence it once had. However, that is not entirely a bad thing. Christians, now humbled in their quest to impose some form of Christianization on the country by political means, have an opportunity to return to basics, i.e., the preaching of the Gospel. The reality is that Christianity has always taught that sound theology is at the core of what the church is, with moral action being the outworking of what we believe. Christian teaching has become sloppy, often inadequate for those within the church and uncompelling for those outside.

With that social influence diminished, perhaps American Christians can repair those breaches. These days, everyone knows what evangelicals think about gay marriage. The church will be healthier when the world knows what the church thinks about justification by faith.
