Evangelical leaders echo Obama, say U.S. not a Christian nation

President Obama has taken plenty of heat in conservative Christian circles for a remark he made in 2006 in which he said that the United States was no longer “just” a Christian nation, but was religiously diverse. Now, it turns out, he has allies for that view: evangelical Christian leaders.

In a statement issued Tuesday, the National Assn. of Evangelicals said that when it surveyed selected evangelical leaders about whether the United States was a Christian nation, 68% said no.


When has our Constitution ever recognized Christianity?

It has been a nation of predominantly Christians…but it is not a Christian nation in the sense that some nations are “Islamic” in the present or “Catholic” in the past.

Or any other faith, for that matter.

I believe that faith is stronger when it is a choice. You can’t get authentic faith by force of law. You can get people to obey laws, but you aren’t going to change anyone’s hearts that way.

I try to respect everyone’s journey, whether they end up at the same destination as I do or not.

That doesn’t mean the state shouldn’t recognize the Church though. ;):cool::smiley:

Recognize it how? I’m not sure what you mean by that.

I think this is a loaded question. Yes, we are a Christian nation in the sense that most of us are Christian, but our laws, however much they may be inspired by some Christian teachings, are not Christian. Also, I think the pastors in the National Association of Evangelicals see this differently than Obama does. They rightfully see America as a nation where Christians are discriminated against and are being pushed out of public life.

Obama views this as a good result; the evangelicals view this as a symptom.

Well said! :thumbsup:

Official religion of the state. :wink:

The US has no state church, so in that sense it is not a Christian nation. The majority of the people are at least nominal Christians, so in that sense it is a Christian nation. But in a very disturbing way it is an anti-Christian nation: the government has determined that public displays of Christian faith are often illegal and has restricted such displays.

Several of the original thirteen colonies had state churches, and several had religious tests for office. So Christianity was the official religion in the US in a very real sense at one time. And to be clear, this was even after the signing of the current federal constitution.

Very educated and well said. :thumbsup:

Yes. Before the American Revolution, the Church of England was established by law in my state of South Carolina. After the Revolution, Protestantism in general was established. Later, equality was extended to Catholics and Jews. South Carolina’s constitution, Article VI, Section 2, still requires that any candidate for office acknowledge belief in “the Supreme Being” (though because of Supreme Court rulings, state constitutional provisions such as this have no legal effect).

Many other states gave legal recognition to the Anglican Church, the Congregational Church, and the Dutch Reformed Church.

And that’s why I do NOT want a state religion: it is very unlikely that the state religion would be Catholic.

Maryland was founded by Catholics, and for a while they ruled the colony. But that didn’t last long.

We’re not a Christian nation. A majority of its residents are Christian, but we are a nation of many beliefs, as well as of those with no belief in a God or higher being. We are a melting pot of the world. There are Roman Catholic Christians, Protestant Christians, Anglicans (I list them separately only because the Catholics on here will call them Protestant, but Anglicans may not), and there are Jews, Muslims, Buddhists, Hindus, Unitarian Universalists, agnostics, atheists, and so many others. I apologize to anyone in our nation who adheres to one I didn’t list. But you get the picture. A nation of many. May we strive as a nation to respect our differences and to live in peace with one another.

Most of the Founding Fathers were Deists. Thomas Jefferson cut all of the miracles that Christ performed out of his Bible; Christ became nothing more than a good moral teacher. It is fiction that the United States was founded as a Christian nation.

In my opinion, the claim that the Founding Fathers were Deists is an oft-repeated claim that lacks evidence and explanatory power. We need to define who the Founding Fathers were. Were they those who signed the Declaration of Independence? Those who created the current federal constitution? People forget we had another national government, for the confederacy of sovereign states, for eight years. Why are our founders only politicians? Was our country not also founded by businessmen, farmers, laborers, and even clergy? Why do we only care about the thoughts of those who sought and obtained political power? Why do we only care about the federal government when the US was a confederacy? Our nation was just as much founded by the people and the states, who clearly were Christian, as indicated by established churches, religious tests, and more.

I’d also point out that this argument that the Founders were Deists tends to focus on certain points in people’s lives and ignore any contrary evidence. John Adams wrote the Massachusetts constitution, which among other things called for taxing citizens to pay for clergy to be taught so that they could spread the Christian faith. George Washington, as president, offered a Thanksgiving Proclamation in which he asked the citizens of the US to pray to God and ask for blessings. That is hardly a Deist concept. The Continental Congress opened with prayer to Jesus Christ. Were most of the Founders thinking that was a bunch of nonsense? Did they start their day listening to what they thought was a lie? Maybe they did. And if so, they were charlatans, and their opinion is not noteworthy.

All true, and yet our basic founding is in the Judeo-Christian mold. Our laws regarding conduct and the recognition of God in our founding documents lead to the conclusion that we are, without being so by statute, a Christian nation.
While e pluribus unum is true, so is “In God We Trust,” and based on the overwhelming faith of those who brought this nation into being, that essentially means Christian.

More important, though, I believe, is the danger of secularizing our history. The very basic premise of American governance was and is the belief that individuals have rights received from the Creator (however you describe the Creator), rights that are antecedent to government, and that government gets its power by consent of the people. This turned the European monarchy system on its head, as Europeans believed that the monarch (the government) got its power from God and, in turn, gave individuals whatever rights it saw fit.
If we remove God as the grantor of individual rights, then government will return to its position of power over the people, and become the provider of rights. IOW, tyranny.


Is this the Americanization that acculturates immigrants to American customs and values, or the type that goes international by bringing American food/technology/pop culture to other countries, along with ideas pertaining to business and politics? Either way, I struggle to see how this is any more anti-Christian than the enduring Latin influence on art, culture, language, and practices in certain parts of the world.

DISCLAIMER: The views and opinions expressed in these forums do not necessarily reflect those of Catholic Answers. For official apologetics resources please visit www.catholic.com.