AMERICA FOUNDED AS A CHRISTIAN NATION? Americans are saddened at being sold out by their so-called representatives in DC. Sadder still are the Americans who believe the system still works the way they were taught in school. It’s like an older person with dementia who keeps going outside to water the flowers of a home that is long since gone.
Controversial indeed. Who among Americans doesn’t like a good yarn about good vs. evil, with the USA as the Good Guys? Count this writer among those who at one time believed the “USA was founded as a Christian nation” fairy tale.
Actually, the best way to determine whether the United States is a Christian nation is to look at the founding documents. Nowhere, not even once, do any of the USA’s founding documents mention “Jesus” or “Christ.” Pretty odd for a country not to mention the person it was supposedly founded for, huh?
Still, this article raises some good points.
h/t: Citizen Tom
Originally posted on altruistico:
It may seem intuitive, at first, to attempt to answer this question by focusing on government. But the best way to determine whether or not the United States is a Christian nation is to compare the philosophy of its people to the Word of God.
The Declaration of Independence states that every person has these God-given, inalienable rights: life, liberty and the pursuit of happiness. This philosophy is what we could call the “American Worldview,” and it drives everything about the nation, from its economic and foreign policy to the private lives of its people. This is the atmosphere in which most of us have grown up. But can this American Worldview be called a Christian Worldview? Can we really call the United States a Christian nation?
First, what does “life” mean to a Christian? Most Americans would say we have a right to be alive, just by virtue of having been born. Most Americans would say we have the right to do with our lives as we choose, because our lives belong to us. Christianity agrees that we have the “right to life” and recognizes that life comes from the Creator, just as the Declaration says. However, the Christian (biblical) view is that the right to live does not exist by virtue of being born, but by virtue of being created first in the mind of God (Jeremiah 1:5). Acts 17:25 says that God “gives to all mankind life and breath and everything.” The Bible is saying here that the life of man is sustained by God, and as such, it belongs to Him. But Americans generally believe that we are free to do with our lives just as we please, because we believe our lives belong, primarily, to us.

For a Christian, God’s law is the absolute truth and the final authority. It tells the Christian “Thou shalt not murder” and “Thou shalt not bear false witness.” But the United States shows, both by the lives of her citizens and by the laws passed in her courts, that she neither recognizes the authority of God nor respects His laws.