I have been pondering this question for a while now and was wondering what others thought. Lately I have noticed that more people, organizations and governments are talking about the irrelevance of religion or religious belief. I am mainly talking about Christianity, as in my country, Australia, we seem to be moving away from it pretty fast. Considering that we were built on Christian values, it seems that we are forgetting our foundations.
I think this has been spurred on by the troubles we have had with some of the extremist groups like ISIS, which are giving all religion a bad name. I know that fewer people have been involved in religion over the past few decades, and today's young generation is the least religious. I think at least in Australia, because we are a multicultural country, we are seeing other religions come in, and the government is saying we can't have any one religion being dominant. They are being too politically correct, saying it may offend other religions if we promote our own Christian beliefs.
So now we can't have traditional celebrations like Christmas or Easter, and they are not being acknowledged anymore. In fact, the commercial aspect of these occasions is becoming bigger, and kids are more likely to believe in Santa Claus. It seems some people are even getting angry or intolerant towards any religion now and see it as not being relevant in today's world. Do you think this trend will continue, and will Christianity eventually be totally removed from all public places?