What would y'all say are American values? I would like to say Christianity, but I feel as though many would say we are not a Christian nation like we used to be. I also think about our rights that so many have died for. This is unique to America.
The United States has never been a Christian nation. The US was established as a secular republic, and the non-establishment of a state religion is baked into our Constitution as part of the First Amendment. The Establishment Clause, along with the guarantee of the free exercise of religion, is essential to the American political doctrine of the Separation of Church and State, and it is the basis for the individual freedom of religion the Constitution guarantees to everyone who lives here.
Further, much of the history of the United States reflects the usual evil we behold in the world: nation rising against nation, war, discrimination, genocide, slavery, and so on. It has taken us a long time, as a nation, to take even the first steps toward achieving universal suffrage for the people of this country. In the beginning only land-owning white males had the right to vote, and the process of including more people has been a slow one that has met strong resistance time and again.
America is unique in that it was the first liberal democracy of the modern era. In a time of monarchies, the establishment of a republic built upon the democratic and liberal principles that human beings have inalienable rights by the sheer fact of being human, and that a government exists first and foremost for the good and service of the people, was a revolutionary idea in the 18th century, even though America did not invent it. America was simply the first nation of its kind. The ideas upon which America was built draw on traditions old and new: the Roman Republic, the Greek democracy of Athens, the political philosophies of John Locke and others of the time, and the Iroquois Confederacy all played major roles in shaping American democracy and republicanism.
American values have changed over time, so pinning down exactly what "American values" are is a difficult question. But many would agree that there are certain values we hold in high regard as something to strive toward. One such value is that all people are created equal, a value we have championed since the beginning but which, as our history shows, we have struggled to live up to. The equality of all people, not just land-owning white men, has been, and continues to be, a driving force behind major changes in American society. When systems of injustice are made evident, there have been responses, and those responses have frequently become the target of hostility and even violence from those who risk losing power by the upsetting of the status quo. It's what we saw with Abolition and slavery, it's what we saw with the Women's Suffrage movement, it's what we saw with Civil Rights, and it's what we continue to see in the Black Lives Matter, MeToo, and LGBTQ rights movements.

The great irony is that many who have stood, both in the past and today, against extending basic civil liberty, freedom, and justice to all persons usually rally behind the moral veil of "values", ignoring that perhaps the most central value agreed upon by most Americans over the past nearly 250 years is the one Thomas Jefferson summed up in penning the words "All men are created equal": the basic idea that human beings have inherent and inalienable value in and of themselves, and thus deserve fair and equitable treatment in a society that claims to value civility, freedom, and justice. It is a value that has been championed in every generation, frequently betrayed by those in the seats of power, and which continues to serve as an aspiration for Americans in every generation--not just for themselves, but for their neighbors and their children.
This also brings me to a second point about why America is not a Christian nation. America's treatment of "the least of these" has historically been abysmal. Fundamentally, I don't even know how "Christian nation" could be a meaningful term--what would a Christian nation look like? How does a nation become Christian? I know what it means for a person to be a Christian: to have been received into Christ's mystical Body, the Church, through the new birth of Baptism; to have received God's gracious kindness to us that is in Christ, for the forgiveness of our sins; to place our trust in Jesus; and to seek to walk in that faith in obedience to Jesus Christ and in sacrificial service to all creatures. I don't see how such a thing can apply to something as abstract as a "nation", however.
-CryptoLutheran