When populist Christians declare that America is -- or once was and needs to be again -- a Christian nation, what do they mean? Christians and secularists often engage in debates about this subject without ever defining terms, which is unfortunate since doing so might alter the character of the debate.
Theoretically, those using "Christian nation" language could be doing nothing more than making a sociological statement, which would in itself be noncontroversial, since hardly anyone would deny that Protestant Christianity, more than any other religion, institutional or otherwise, has had a prevailing influence in the United States and most other Western cultures. Even those who are not Christians must of necessity deal to some degree with Christian terminology and norms in order to participate in American culture. It is part of the air that we breathe.
Nonetheless, when those on the religious right speak of a "Christian nation," they are talking about more than sociology. They mean to say that Christianity is normative for what the United States is, and that departure from that norm is a betrayal of national values. Perhaps even more significantly, they would often say, departure from that norm potentially forfeits divine blessings.
Yet it remains to ask what the evidence of such a departure is and, perhaps more notably, what a return to being a Christian nation would look like.
Do people who talk this way mean that Americans en masse will begin to worship the triune God of Christian teaching and be justified by grace alone through faith alone in Christ alone?
No, they don't seem to mean that. Rather, people who talk like this seem to have in mind moral reform: if Americans "returned to God" by "turning from their wicked ways," then people would stop having sex with people other than their spouses, would stop drinking too much or at all, would not go to vile movies, would be less vulgar in their speech, and so forth. While all of these things may be good developments to one degree or another, what those promoting this vision of "Christian America" have in mind is not really distinctive Christianity; it is moralism. They are not looking for belief in the death and resurrection of Christ, but for the moral improvement of the nation.
Thus, it is of more than passing interest that those pressing the idea of a Christian nation set off on the wrong foot by misunderstanding what the word "Christian" means. If they reoriented toward a proper definition of "Christian," they might understand that bringing about Christian commitment is not something a nation can do by exercising the power of the sword; it is something only the church can do through the preaching of the Gospel.
In a different context, the Apostle Paul warned those who exchanged the Christian Gospel for law keeping as the basis for entry into God's kingdom that they had "fallen from grace." Those calling for America to be restored to its status as a Christian nation believe that they are saying something about the status of the nation, but, in fact, they are revealing more about the state of the church. It is the church, not the nation, that has lost track of the basic meaning of terms, and it is the church that needs to hear the Gospel again in order to be called out of its apostasy to the good news of Christ.