[Serious]Are religions harmful or beneficial?

My “religion” is Christianity. Many times I have seen and heard people abuse the name by behaving in ways that do not come from a true Christian. It looks terrible, and I understand why someone wouldn’t want to be one based on what the media portrays of Christians. However, a true Christian is someone who believes that Jesus Christ is our Lord and Savior, who died on the cross for our sins and rose again on the third day. If you read the Bible, you will quickly see that the word “love” appears more often than in the scriptures of other religions. A Christian’s purpose is to help people and bring them to Jesus. For example, the Fruit of the Holy Spirit is a biblical term that sums up nine attributes of a person living in accord with the Holy Spirit, according to chapter 5 of the Epistle to the Galatians: "But the fruit of the Spirit is love, joy, peace, patience, kindness, goodness, faithfulness, gentleness, and self-control.”

As anyone can see, those are all good things. Almost every other religion involves self-harm or sacrifice of some sort. Granted, in the Old Testament you were required to sacrifice your best sheep to be forgiven of your sins. But that was before Jesus came down and sacrificed HIMSELF so that WE could be forgiven and no longer be covered in sin, but live with Him eternally.

/r/Discussion Thread