People not from the USA, what were you taught about the USA while you were a kid?

Honestly, the US educational system is very Americacentric <-- that should be a word. I remember that in elementary school, social studies taught the same thing every year, which was American history. They would add a little more information as we got older to keep it age-appropriate. We learned a little about the Roman Empire, a couple of Chinese empires, the Classical Greeks, and some European countries (Great Britain, Spain, France, Germany, etc.) between 3rd and 8th grade(?). But those lessons were more a way to show how we, as Americans, learned from those societies; it was never actually about the different cultures themselves. That's why I call it "Americacentric."

Also, the US educational system prefers to show the good parts of American history while downplaying the bad. It's one of the reasons we, as a country, are ignorant about slavery, the Trail of Tears, the Irish indentured servants, the Japanese internment camps, Jim Crow laws, the fact that Black people were legally prevented from buying property in certain neighborhoods up until 1968, and a host of other unsavory things.

Since most of our citizens hate to see themselves as anything but part of a "great nation," our textbooks were written to reflect that.

We did have the standard subjects such as math, spelling, science, and language arts. I will say that since I am 35 years old, I didn't have to deal with the tampering with science curricula. I learned about evolution as though it is real--which it is. I'm lucky, however. Things are changing, and it seems that they are trying to regulate textbooks now more than ever. It worries me what future generations will learn and believe.

/r/AskReddit Thread