I've been a Christian for most of my life. But the more time passes, the more I can't help noticing how negative and dreary Christianity is. A few points have led me to this conclusion:
The over-emphasis on sin and the dark side of humanity. Taken to the extreme, this can cause real problems with one's self-image.
The emphasis on death, blood, killing, sacrifices, and other morbid things in the Bible.
The downplaying of anything good or positive in this world. Christians often forsake the beauty and goodness this world has to offer in hope of the next.
Hellfire-and-brimstone teaching. God is always angry and ready to condemn people because, Christians say, "He is just."
The arrogance and zeal of being religious: 'If you're not of MY particular brand of Christianity, you are going to hell, or at the very least you are in serious error and a heretic.'
The condemnation of having money, of trying to live a good life, of desiring a spouse and family, or of trying to do something worthwhile with your life. Dreams in general are bad, it is taught, unless they directly involve spreading Christianity.
The huge emphasis on suffering, trials and tribulations. No one said life is easy, but why would you intentionally TRY to make your life worse? 'It is foolish to suffer deliberately,' mental health professionals tell us.
The justification of religious violence, genocide, natural disasters (as punishments from God), etc.
Marriage and family are generally downplayed, while celibacy is exalted as a spiritual virtue.
Christians tend to neglect the good in their lives simply for 'spiritual' reasons. This world doesn't matter - ONLY eternity matters.
There are probably more, but those are the points that come to mind off the top of my head. I'm finding it difficult to appreciate a faith so overtly focused on the negative aspects of life. When I think about it, it really seems like a glass-half-empty religion.