I know this topic has been discussed at length over many years; however, it is one that doesn't seem to get old. There are two sides strongly voicing their opinions on the matter, and I rarely see anyone land in the middle. I would like to hear what other Christians think about this!
I have read things from progressive Christians saying there is no such thing as gender roles anymore and that no man should tell them what to do. On the opposite side, I have read that women shouldn't go to college, get jobs, or venture into the world outside the home, lest they compete with men and be influenced by the evils of the world. I have read that it goes against God's design for women to have a career, with verses such as Titus 2:5 and 1 Timothy 5:14 quoted in support, and some, such as "The Transformed Wife," even go so far as to say it is a sin for a woman to work outside the home.
Let me know your thoughts on this!