Hello all
I have a question for everyone. I grew up in church all my life. My father was a Baptist minister, so I was in church as soon as I was able. Therefore, I've gone through what I am sure a lot of you have as well. In church I was taught to believe in God, but I was never taught why. No real attempts at establishing a belief in God were made. I hate to say this, because there are a lot of wonderful people I met in my church growing up, but more than anything else I feel like I was just told the doctrines of the Church and that I should believe them.
So, by the time I was a senior in high school, for the first time I really examined my belief in God and realized there was nothing there. I realized that I only believed in God because I had been told to do so, and I had no real belief whatsoever.
And I was terrified.
I tell you the truth, I was scared to death. I didn't know what to do. I had put so much stock in something that I now thought was probably untrue. I didn't talk about it with anyone because I was so afraid, and I really think that for a while there I could have reasonably called myself an atheist, or at least an agnostic. The thing is, I have always been an intellectual and somewhat of a skeptic, so I had many unanswered questions. When I realized my church had not answered any of these questions, I assumed there weren't any answers to them and I had to just go on blind faith. Surely they would have addressed this if it were this important. Right???
Since that time, I have searched for my answers, and while I haven't found them all, I have found a rational basis for my belief, in no small part thanks to some very wise people here at CF. Faith is still part of the issue of course, and always will be, but it is not blind anymore.
My question is this: why didn't my church address this at some point? I am sure there are many like me who didn't get the intellectual basis for their beliefs from church, and maybe some of them have turned from the faith because of this. Shouldn't this worry churches enough that they feel they should address the issue?
My main problem with the church was that they stressed some sort of herd mentality--just agree with me when I tell you to believe in God, and agree with all my conceptions of God and you'll be OK. One of the most important things I've learned since that scary time is that everyone's faith is their own; everyone has different ideas of what it means to be a Christian. Shouldn't we be teaching this in churches as well? When I realized I didn't agree with some things other Christians thought, this scared me as well. Now I realize that is OK. My church never mentioned this.
So what do you guys think? Are churches making a mistake in not addressing these issues?