As someone from a non-American country, what American history do you learn in school?

Depends on what State you live in.

I love explaining stuff like this so any follow up questions please ask.

I completed high school in 2010, having gone through schools in multiple States by then, in both the North and the South.

General American history typically begins around the time of Jamestown and Plymouth Rock (the Mayflower landing). Throughout numerous courses over the years we'd cover the religious influences of the time, what brought colonists to the Americas, and how the colonies interacted, and then jump into the Revolution.

War history in America isn't taught in much depth. This is part of why a lot of Americans have a black-and-white view of international relations: they were never taught how complicated it actually is. As far as the Revolution goes, we typically covered what started it and where Washington went until he cornered Cornwallis at Yorktown, Virginia.

The next thing to be covered in the South is the cotton gin. (This is HUGE in the South, as the entire South was built on cotton.)

In the North, they focus on the Articles of Confederation that first governed the States and then on the Constitution. Most of middle school history revolves around the State and Federal Constitutions.

After those things are covered, courses move into the 1800s, which typically cover Manifest Destiny and Western Expansion. Things take a turn when we get to the Missouri Compromise of 1820. After that, all focus is on the Civil War and the events leading up to it: Bleeding Kansas, the abolitionist movement, and finally Southern secession.

Both the North and the South avoid going in depth into what actually caused the Civil War, instead focusing solely on the debate over slavery, which is a bit debatable, but that's how it goes. The Northern schools demonized the South and focused solely on Northern victory; the South, instead of demonizing anyone, recognized the valor of both sides' soldiers and, at least curriculum-wise, accepted defeat.

Neither Northern nor Southern schools covered the Reconstruction era after the war very well.

After that, the North jumps into the early 1900s and industrialization, while the South focuses on the advancement of black culture in the late 1800s, but they tend to meet up just prior to World War I. After the World Wars are over, high school barely covers the Cold War, mentioning Korea only in passing while focusing on the failures of Vietnam.

In the South, the Cold War is depicted solely as American victories, such as the Cuban Missile Crisis, during which America actually backed down before the Soviets did. Oh well.

The 90s are covered as the rise of the American economy and the rise of extremism; 9/11 was the very last thing in my high school textbook.

Any more questions please ask.

/r/AskReddit Thread