What’s a thing that all women do but never admit?

You're completely missing my point.

Would you feel good covering your body with colored dirt? Because that's what some African tribes do, and I bet they feel good doing it.

I'm not saying you don't feel good, and I'm not saying don't do it. It's completely normal to want to feel beautiful, and it's normal to feel good when you do. I just hate where all of that beauty comes from.

I'm just saying that most beauty concepts nowadays come from institutions whose single goal is making money, and they've been very successful at associating beauty with rituals that make us buy stuff. I don't get why this is so controversial. Do you think it's okay for corporations to indoctrinate children about what beauty is? Self-esteem is at an all-time low because of this. It went from "just be yourself" to "it's for yourself"...
