I hear the two older generations say "America was Christian back then." But the older I get, the more I believe it wasn't.
I'm thinking about starting a topic in the Debate section on this...if it doesn't already exist.
Faith and society have both changed. Historically it’s probably fair to say that faith influenced society more than the reverse, but we definitely see the opposite now.
Christian faith, where it even exists, is often a watered-down mishmash of beliefs that is pretty hollow at best, or outright heretical at worst.
There's seemingly very little conviction anymore. I think faith and Christianity appeal to many people because they offer a sense of belonging, but that's often accompanied by nonsense like universalism and other warm-and-fuzzy theology that is completely at odds with scripture.