ABSTRACT: The idea of a “Christian America” is part myth, yet it carries real meaning. On the one hand, American history offers little evidence of a distinctly Christian founding; many of the Founders, in fact, actively opposed Christianity and sought its disestablishment in the new republic. On the other hand, the decades after the Founding saw a surge of Christian faith across the country. By the eve of the Civil War, America could justifiably be called a “Christian nation,” but its Christianity was cultural rather than political, the product of vigorous local and national enterprises rather than of governmental action.
Here’s the link to the article: Guelzo: Was America Ever Christian?