Non-Americans, what misconception did you have about America that you learned was wrong upon visiting/moving to America?

Immediately after 9/11, Muslim and Middle Eastern culture became a topic. What had been under the radar was now at the forefront of everything.

I recall my mom avoiding women with partial head coverings at the grocery store, quietly saying, almost embarrassed at herself, "You just don't know, you just never know..."

It was a weird time, BUT I think it worked out for the better. When xenophobia is in our laps rather than in our subconscious, it gets dealt with much more easily.

People were scared; it was like the AIDS scare and gay people all over again. All of these groups have been quietly discriminated against for a long, long time, but now it's out in the open. It's talked about. That leads to progress. It takes an earthquake to raise up the corpses.

Now my mom shuts down my grandfather, who thinks it's cool to loudly broadcast, "Hey look at those queers! They're holding hands!"

I'm proud of her.

In the end, multicultural societies are a little messier, but they're better for it. Embrace your heritage, man. We're all immigrants in the USA.
