What is a prevalent and popular mindset/opinion on Reddit that you disagree with?

I have to say it's this idea that you need to see a doctor for the slightest bit of pain or discomfort. Not sure exactly why this is... maybe because Reddit is fairly young and educated, and they put a lot of stock in doctors. Also, it's so easy to Google a symptom and end up freaked out that you have cancer. This is a hard one for me because I used to be a major hypochondriac who saw a doctor for everything, and I'm only recently getting better.

But I see this all the time on Reddit. Headache? Doctor. Stomach hurts? Doctor. Paper cut? Doctor. Not saying people should never go to the doctor, but you know... most aches and pains will go away on their own.

Besides the fact that most pains clear up on their own, doctors can exaggerate or unnecessarily alarm you to give you prescriptions you don't need. That's how I ended up hooked on a bad medication as a teen. I can't count how many times I've left an office with like six scripts when I needed maybe one at most.

Also, doctors are perfectly capable of misdiagnosing you, especially with how little time they usually give patients. You can often be more accurate doing your own research than talking to a rushed doctor for two minutes. Since 99.9% of the time whatever you have isn't life-threatening, it'll go away on its own even if they misdiagnose it, or worst-case scenario you have to come back a few days later, which just gives them more business. It's no big deal to them; your time isn't a priority.

Doctors can definitely be useful and necessary, OBVIOUSLY, but I think they often get treated like omnipotent beings who will always solve everything for you.

/r/AskMen Thread