You hear about how Hollywood promotes a lot of things that go against the Bible. In my opinion, it seems to be getting worse, as if Hollywood wants to pull the public further and further away from God. Also, most of the celebrities (young and old) who are heavily in the spotlight are the ones who do not seem to support Jesus or the Bible.
What are your thoughts on this topic? Do you think that Hollywood plays a big part in helping the Anti-Christ?