Kaon
Well-Known Member
Mar 12, 2018
I read an article about the decline of Christianity in the UK, and a few Google searches indicated that this trend is happening all over the Western world. So, is it inevitable? Is it a sign of the End Times? What is the future of Christianity?
Personally, I think the decline of Christianity is being engineered by Luciferian forces. The political, media, and industrial elite, though often masquerading as Christians, are actively trying to destroy it.
I think more people are rejecting dogmatic institutions and are trying to find an organic understanding of the Most High God. In that respect, I think the perceived decline of Christianity in the West is a beautiful thing.
But people are also being misled and swindled by the institution, so they become disillusioned and leave. Moreover, now that the West accepts homosexuality, more people are leaving an institution that highlights their sin more than others (like adultery, gossip, or lying).
It is a general falling away from the institution, but those who are of the Most High God never stay down or fail even if they get lost, if that is indeed what is happening.