Hi,

I have noticed that a small minority of Christians (I'm not trying to make a hasty generalization here) don't really trust their doctors. I have heard stories, some of them ongoing, about people refusing to go to the doctor, trusting instead that God will heal their sickness in a divine way. This has left me somewhat worried about these people.

I'm not a doctor yet (I'm not even a pre-med student), but I feel a bit concerned about this minority of Christians. At the same time, I don't know whether it's right for me to feel this way.

Do Christians consider it a lack of faith in God to put trust in doctors?