What are the brightest and darkest periods in American history?

I hate to be the pedant, and I'm honestly not trying to be a dick, but are we talking the history of the continent or of the United States as a nation?

I'd offer up the post-WWII years as a transformative time in the U.S. There was a lot of new hope, especially after the previous era of depression. It was the birth of the middle class. All kinds of cool things were happening for people. After a time, many of the "cool" things turned out to be horrible and deeply destructive, but people weren't immediately on to the small and large ways they were impacting their society and environment. They were psyched. Buying houses and cars. Going to the beach. Traveling the new interstate highways. I'd offer Sputnik as the end of that period.

The darkest period is perhaps the pre-U.S. colonial era. It's a grotesque comparison to draw, but I'd offer that the outright genocide of that era is the darkest time. Ancient societies were just dusted off the map. We pay a certain amount of lip service to the horror visited upon aboriginal Americans, but the true destruction is, mercifully, beyond our knowledge and comprehension. I'm always tempted to call upon a frequent counterfactual plot for sci-fi writers: what if Hitler had won the war? In some ways (not all, but some that are important) the United States tells the story of that counterfactual. What if one culture wanted to erase another culture from existence? The United States is an example of that process. Hitler killed people he felt were lower orders of human than himself, whereas colonists killed Native Americans for land. And though they too certainly felt Native Americans were lower forms of human, one could argue that Europe's obliteration of pre-contact North Americans "wasn't personal."

Lots of generalization there, but I would offer those two periods as an answer to the question.
