[Serious] If a culture is traditionally ruled by men, why should they allow women any rights? In attempting to be progressive, how could outsiders influence such a culture to allow women to become equal members of society?

I’m all for these women making change and finding their way to that place in society, but most are hardly going to have a say in what goes on in their culture.

I'll never not laugh at people's dismally low expectations of women. Y'all really think women play zero role in shaping those cultures just because, from the outside, it looks to you like they're getting the short end of the stick. It's almost biologically impossible for human women not to have a humongous impact on any culture. Women have far too much influence over human beings' developing years for that to be the case. They spend the majority of time with children, especially in those cultures. For instance, did you know that female genital mutilation is mostly performed by other women? But of course, to people with such low expectations of women, women have no agency. At least not when they're doing something negative that affects other women; then that agency gets offloaded onto men. Human beings naturally do this. We have some sort of impulse to create masculine figures who are responsible for everything. It's quite fascinating.
