Were any of you taught that America is the best country in the world at school?

Yes, this is generally referred to as American Exceptionalism. Even when it wasn't said explicitly, it was and is consistently implied throughout my education that America is the best country. This stems from beliefs about things like military prowess, the US being a beacon of hope, the idea that everyone wants to live here (e.g. immigrants), and scientific superiority.

There are plenty of people who say that the US is the best country in the world while also criticizing it. In education this shows up as covering American atrocities lightly and focusing on the ways America was special - America defeated Hitler, America brought freedom to people first, America was designed for everyone and not just royalty, etc. All this exceptionalism means is that when push comes to shove, America is the best. As an aside, I've often found this view to be common among immigrants and their more recent descendants, especially those who fled their country due to a lack of freedoms, opportunity, etc. To exceptionalists, patriotism is expected, in part, because not being patriotic would show a lack of gratitude for all that America has given you (an American citizen).

It is important to note that this doesn't mean the US doesn't value other countries and cultures. In fact, US diversity is often a component of American exceptionalism. Education focuses a lot on how America is a melting pot where anyone can succeed. Racism is treated as more of an aberration than a core part of our history. Americans are more likely to hear of something bad happening in another country and say "aren't you glad you live in the US" than to talk poorly about another nation for no reason.

If you've gone through the American education system without hearing about American exceptionalism, I would recommend thinking back on your school curriculum and the kind of narrative it formed.

/r/AskAnAmerican Thread