Hello all,
Recently there's been much in the news about the Duggar family, what with the allegations of the older brother's abuse and so on. I wanted to see what everyone here thinks about their theology, their TV show, and whatever else. I do have a suspicion, however, and it is as follows:
It should be no secret that the vast majority of people in the West do not like the real Bible. I think that even many modern Christians also do not like the real Bible or real Biblical theology. Indeed, people enjoy deriding those who take the Bible seriously. I think that part of the reason this show was put on television (and I have to admit I haven't seen even one episode) was to give people an outlet to antagonize and attack Fundamentalist Christianity (i.e., those who take the Bible seriously and study it a lot). In other words, part of the reason this show was put on television was to show people how "absurd" Fundamentalist Christianity is.
Any thoughts?