What is a common myth about your country that is 100% false but that many people still believe?

I mean, as a US Southerner I know a lot more about the Civil War and the civil rights movement than people who weren't educated in the American South. I had ancestors who fought for the South. One set of grandparents was on the right side of the civil rights movement (as a result I don't know much of my grandfather's extended family; they disowned him). The other two were pretty racist but also way too poor to do anything of note during the civil rights movement (and FWIW, before they died they both seemed to realize that racism and homophobia are wrong - I mean, they let their granddaughter use their farm to marry her non-white girlfriend, so I'd say they evolved).

I also have a great-uncle who was a cop in Birmingham in the late '60s and '70s, and he has a drawer full of confiscated stuff - he would often just take the paraphernalia and let the person go, since he hated arresting people who weren't hurting anyone. It's mostly various types of pipes, but there's some KKK-branded stuff in there as well (apparently they sold merchandise?).

So while I was born after all of that happened, I know a lot about it, because 1) we studied it fairly intensively in school since it's our history, and 2) I actually know people who were there or have family members who were involved.

/r/AskReddit