I was appalled reading the article "Where should Christians draw the line in trying to make the U.S. a Christian nation?" on this site.
Look, nobody is trying to take away your religion. You are free to practice it as you wish. But you CANNOT impose it upon others, and you CANNOT force it into a mandatory pledge.
The article claims: "In terms of religion, it is not a matter of 'trying to make' the United States a Christian nation. The U.S. has always been a Christian nation from the earliest colonial settlements. It remains a Christian nation today. The vast majority of U.S. citizenry claim at least a nominal adherence to Christianity, and, amazingly enough, the majority of our new immigrants make a similar claim -- even those from Asian countries. The U.S. was founded upon Christian principles and beliefs."

How can you say this? Yes, the majority may be Christian, but FOUNDED on Christianity? I think not. Washington, Jefferson, and Adams, just to name a few, were Deists. Franklin was an atheist.
The article goes on: "In America, this 'line' has shifted over the years. In other words, as the definition of a 'religious establishment' has changed, the concept of America as a Christian nation has also changed."

Yes, the concept has changed. Now it exists. America wasn't meant to be a Christian nation.
It continues: "In America, it has become politically incorrect to refer to the U.S. in religious terms according to the religion of the majority."

Was it ever not?
And further: "Such was not always the case. Historically, throughout the world, America has always been considered a Christian nation. Why? Because Christianity was the dominant religion of the land. Christianity shaped the thinking of America's forefathers and informed the making of public policy and law. Christianity was a part of the common law; it was woven throughout the fabric of American life."

As I already said, the majority of the founding fathers were Deists.