America's Christian Influence

There has been much debate over whether America was originally founded as a Christian nation, yet the Christian influence on that founding is undeniable. What follows are articles documenting the historical links, along with book references, that demonstrate the strong influence of Christianity on the founding of the nation.