Now, we know that some Christians will avoid movies or TV shows that contain sexuality or bad language, particularly f-bombs and taking the Lord's name in vain; that's a given. But as word-of-faith Christians, are shows that deal with sick people in the hospital, or with cancer, in and of themselves carcinogenic? Do they plant seeds in the viewer, so that instead of seeing themselves as invincible (because the Spirit of God that raised Jesus from the dead lives in them, they can command any mountain to be moved and cast into the sea with mountain-moving faith, by His stripes they were ALREADY healed, no plague shall come near their dwelling, and WITH LONG LIFE WILL HE SATISFY THEE AND SHOW THEE HIS SALVATION!!!), they start to expect sickness? Do these storylines run contrary to the word of God, and are they an abomination? Or does fiction simply deal with real-life scenarios? (People in society do get sick, and some people die; that is why there are hospitals and cemeteries around. If you pass by those, is that an abomination too?)
Take a movie like Breakthrough (2019), which I see as a true word-of-faith movie: a boy who suffered a near-fatal drowning was brought back to life because of his believing mother. Movies like that increase your faith.
So the question is: is it safe to watch movies or TV shows involving such a theme, or referencing someone dying of that disease? Or do these shows plant seeds of death in the minds of the viewers, and even of the actors, actresses, and other people involved? Should we be checking their history to see if they caught it themselves? In a sense, are shows depicting this more dangerous than shows with swearing, sexuality, and other bad content, and is the Christian world misguided in emphasizing the wrong things?
Another thing I noticed is that when Christianity is introduced in those storylines, it is always framed as coping rather than curing people by faith: comforting people, and being there when they die so they don't die alone. You never see raising-Lazarus-from-the-dead faith. The Bible says you lay hands on the sick AND THEY SHALL RECOVER... these are the signs that follow believers.