Has anyone grown up with a mother who believes that women belong in the kitchen?

My mother has some very sexist views that she tried to push on me. She's always told me that a woman's role is to serve her husband, take care of the kids, clean the house, and cook, while a man's role is to work, discipline the kids, and fix things around the house. I always argued that it didn't make any sense and that I didn't want to have to do those things just because I'm a woman. Her only reasoning for why women should do this was "because the Bible says that's a woman's job". She's also one of those people who think that all men are rapists and that women don't (or shouldn't) enjoy sex. According to her, men have sex for pleasure, and women should only want to have sex for reproduction.

Needless to say, living with someone like this was awful for a while. She constantly started arguments over it and kept insisting that women who don't do these things are awful, lazy wives. She kept this up until I was around 17, then backed off when she realized I wasn't budging on my beliefs. I've never wanted to be a "traditional" housewife (or a housewife at all), and I don't want a husband who expects me to do everything for him. We still get into arguments about gender roles sometimes, but she's not as pushy about it as she was when I was younger.

/r/AskWomen Thread