I'd like to see what everyone's stance on this is. Was America founded as a "Christian" nation or not?

I've found this often comes up when I debate with, er, a certain group of people. They always bring up how our founding fathers supposedly meant to establish America as a Christian state built on Christian principles, blah blah, which in their view makes it OK to cite the Bible when arguing politics.

Personally, I think it certainly was not. Our founding fathers recognized that government + religion = disaster. The phrase "separation of church and state" comes from Jefferson, one of the most prominent founding fathers, in his letter to the Danbury Baptists. And the First Amendment clearly says:

"Congress shall make no law respecting an establishment of religion.."

I think that lays it out pretty clearly.
