It seems to me that a lot of churches these days, if not most, are way too "sugar-coated": all about feel-good sunshine and what God and Christianity can do for you. What do you think it would take to uproot churches from this pop-culture way of presenting and believing Biblical Christianity? Just wondering. Thanks in advance for any credible replies.