Redditors Unhappy with Christianity - how did Christianity negatively impact your life to the extent that you now have a negative opinion of the religion?

I'm not unhappy with Christianity as a concept...what I find disturbing is the disgraceful community of human beings that organized religion has created.

As a kid, going to church was great. It was about fellowship, and learning how to be a better person. It was about having friends from all walks of life, and not judging them for who they were or where they came from. We were taught the simple tenets. Be a good person. Learn not to judge. Everyone makes mistakes, but if we have forgiveness in our hearts, we can move past it all and create a better world.

As you get older, church becomes a popularity contest. Cliques form. The message is lost in the formality of the spectacle. Now you have to dress a certain way, talk a certain way. You have to believe what everyone else believes. Hatred starts to form. You start to loathe people who dare to think differently from you. This one all-inclusive religion begins to separate into smaller and smaller clusters known as denominations.

Love and forgiveness fall by the wayside as fear, judgment, and intolerance take over. Some are more tolerant than others, and for that they are ridiculed by the rest. Some turn a blind eye to the things their own members and leaders do...evil things...all the while pretending not to see.

Soon, the word of Jesus Christ falls on deaf ears. Pastor says this, and pastor says that. The Bible teaches us this. And somewhere along the line, it became OK to believe in things that directly contradict what Jesus Christ actually said to his followers. Words that were spoken before Jesus Christ was even born, before Christianity was even started...somehow hold more weight than the words of the man who is the living son of God.

"Judge not, lest ye be judged" and "Let he who is without sin cast the first stone" have long been forgotten by most of the organized "Christian" denominations. Such simple, unifying principles on how to live your life are instead replaced with condescension and disdain for all things "the church" deems evil and a sin.

Somehow, the "Church" became more important than the message.

I truly miss how much I enjoyed going to church as a kid. And I fault not a single person for choosing not to believe, because frankly, organized religion is the worst part about having religious beliefs.

/r/AskReddit Thread