What is something that absolutely disgusts you but the majority seems to be okay with?

I'm Catholic and I've been questioning it for about a year and a half now, and honestly I don't think I believe it anymore. I don't. I'm not saying this is true of all Catholics, and I respect everyone's right to their opinions, but I find my own religion to be very hypocritical. They/we preach love and kindness for everyone yet ostracise women for having abortions (I'm pro-choice). I mean, it's their body, and if they can't support a kid, if they were raped, or even if they simply don't want it, it doesn't matter: it's their body and they should have that option without being treated terribly and told they are going to hell. The way many treat homosexuality: it's not a sin, it's not evil, and they aren't going to hell. The way we treat other religions, like calling Jews hypocrites and saying that Muslims worship false prophets etc. (let them believe whatever they want).

I love some aspects of religion, but in my experience people say they treat everyone with kindness and love. That's just it, though: they say it; they don't do it.

Anyway, rant over.
